Healthcare personnel with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than among hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
Hospitals are required to have antibiotic stewardship programs (ASPs), but there are few models for implementing ASPs without the support of an infectious disease (ID) specialist, defined as an ID physician and/or ID pharmacist.
In this study, we sought to understand ASP implementation at hospitals that lack on-site ID support within the Veterans’ Health Administration (VHA).
Using a mandatory VHA survey, we identified acute-care hospitals that lacked an on-site ID specialist. We conducted semistructured interviews with personnel involved in ASP activities.
The study was conducted across 7 VHA hospitals.
In total, 42 hospital personnel were enrolled in the study.
The primary responsibility for ASPs fell on the pharmacist champions, who were typically assigned multiple other non-ASP responsibilities. The pharmacist champions were more successful at gaining buy-in when they had established rapport with clinicians, but at some sites, the use of contract physicians and frequent staff turnover were potential barriers. Personnel at some sites felt that having access to an off-site ID specialist was important for overcoming institutional barriers and improving the acceptance of their stewardship recommendations. In general, stewardship champions struggled to mobilize institutional resources, which made it difficult to advance their programmatic goals.
In this study of 7 hospitals without on-site ID support, we found that ASPs are largely a pharmacy-driven process. Remote ID support, if available, was seen as helpful for implementing stewardship interventions. These findings may inform the future implementation of ASPs in settings lacking local ID expertise.
The availability of colonizable substrate is an important driver of the temporal dynamics of sessile invertebrates on coral reefs. Increased dominance of algae and, in some cases, sponges has been documented on many coral reefs around the world, but how these organisms benefit from non-colonized substrate on the reef is unclear. In this study, we described the temporal dynamics of benthic organisms on an Indonesian coral reef across two time periods between 2006 and 2017 (2006–2008 and 2014–2017), and investigated the effects of colonizable substrate on benthic cover of coral reef organisms at subsequent sampling events. In contrast with other Indonesian reefs where corals have been declining, corals were dominant and stable over time at this location (mean ± SE percentage cover 42.7 ± 1.9%). Percentage cover of turf algae and sponges showed larger interannual variability than corals and crustose coralline algae (CCA) (P < 0.001), indicating that these groups are more dynamic over short temporal scales. Bare substrate was a good predictor of turf cover in the following year (mean effect 0.2, 95% CI: 0–0.4). Algal cover combined with bare space was a good predictor of CCA cover the following year generally, and of sponge cover the following year but only at one of the three sites. These results indicate that turf algae on some Indonesian reefs can rapidly occupy free space when this becomes available, and that other benthic groups are probably not limited by the availability of bare substrate, but may overgrow already fouled substrates.
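As a rough illustration of this kind of lagged-predictor analysis, the sketch below fits benthic cover in one year against colonizable substrate in the previous year. The variable names, model form, and toy data are assumptions for illustration only, not the authors' analysis.

```python
# Illustrative sketch: turf cover in year t+1 modelled on bare substrate and algal
# cover in year t. Column names and data are invented placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 120  # hypothetical site x transect x year observations
df = pd.DataFrame({
    "bare_t": rng.uniform(0, 30, n),    # % bare substrate in year t
    "algae_t": rng.uniform(0, 40, n),   # % algal cover in year t
})
# Simulated response: % turf cover in year t+1 with a modest lagged effect of bare space.
df["turf_t1"] = 5 + 0.2 * df["bare_t"] + rng.normal(0, 3, n)

model = smf.ols("turf_t1 ~ bare_t + algae_t", data=df).fit()
print(model.params)      # estimated lagged effects
print(model.conf_int())  # 95% intervals
```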
Dietary interventions did not prevent depression onset or reduce depressive symptoms in a large multi-center randomized controlled depression prevention study (MooDFOOD) involving overweight adults with subsyndromal depressive symptoms. We conducted follow-up analyses to investigate whether dietary interventions differ in their effects on depressive symptom profiles (mood/cognition; somatic; atypical, energy-related).
Baseline, 3-, 6-, and 12-month follow-up data from MooDFOOD were used (n = 933). Participants received (1) placebo supplements, (2) food-related behavioral activation (F-BA) therapy with placebo supplements, (3) multi-nutrient supplements (omega-3 fatty acids and a multi-vitamin), or (4) F-BA therapy with multi-nutrient supplements. Depressive symptom profiles were based on the Inventory of Depressive Symptomatology.
F-BA therapy was significantly associated with decreased severity of the somatic (B = −0.03, p = 0.014, d = −0.10) and energy-related (B = −0.08, p = 0.001, d = −0.13) symptom profiles, but not the mood/cognition profile, whereas multi-nutrient supplementation was significantly associated with increased severity of the mood/cognition (B = 0.05, p = 0.022, d = 0.09) and energy-related (B = 0.07, p = 0.002, d = 0.12) symptom profiles, but not the somatic profile.
Differentiating depressive symptom profiles indicated that the food-related behavioral intervention was most beneficial for alleviating somatic symptoms and symptoms of the atypical, energy-related profile linked to an immuno-metabolic form of depression, although effect sizes were small. Multi-nutrient supplements are not indicated for reducing depressive symptom profiles. These findings show that attention to clinical heterogeneity in depression is important when studying dietary interventions.
Emergency Medical Services (EMS) providers are trained to place endotracheal tubes (ETTs) in the prehospital setting when indicated. Endotracheal tube cuffs are traditionally inflated with 10 cc of air to provide an adequate seal against the tracheal lumen. There is literature suggesting that many ETT cuffs are inflated well beyond the accepted safe pressures of 20–30 cmH2O, leading to potential complications including ischemia, necrosis, scarring, and stenosis of the tracheal wall. Currently, EMS providers do not routinely check ETT cuff pressures. It was hypothesized that the average ETT cuff pressure of patients arriving at the study site who were intubated by EMS exceeds the safe range of 20–30 cmH2O.
While ETT cuff inflation is necessary to close the respiratory system, thus preventing air leaks and aspiration, there is evidence to suggest that over-inflated ETT cuffs can cause long-term complications. The purpose of this study is to characterize the cuff pressures of ETTs placed by EMS providers.
This project was a single-center, prospective observational study. Endotracheal tube cuff pressures were measured and recorded for adult patients intubated by EMS providers prior to arrival at a large, urban, tertiary care center over a nine-month period. All data were collected by respiratory therapists using a syringe-style cuff pressure measurement device with a detectable range of 0–100 cmH2O. Results including basic patient demographics, cuff pressure, tube size, and EMS service were recorded.
In total, 45 measurements from six EMS services were included, with ETT sizes ranging from 6.5 to 8.0 mm. Mean patient age was 52.2 years (67.7% male). Mean cuff pressure was 81.8 cmH2O, with a range of 15 to 100 and a median of 100. The mode was 100 cmH2O; 40 of 45 (88.9%) cuff pressures were above 30 cmH2O. Linear regression showed no correlation between age and ETT cuff pressure or between ETT size and cuff pressure. Two-tailed t tests showed no significant difference in mean cuff pressure between female and male patients.
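For readers unfamiliar with the analyses named above, the following sketch shows how such comparisons could be run in Python with scipy; the variable names and simulated values are placeholders, not the study data.

```python
# Illustrative sketch (not the study's code): linear regressions and a two-tailed
# t test on simulated cuff-pressure data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 45
age = rng.uniform(18, 90, n)                      # years
tube_size = rng.choice([6.5, 7.0, 7.5, 8.0], n)   # mm
cuff_pressure = rng.uniform(15, 100, n)           # cmH2O, stand-in for measured values
male = rng.integers(0, 2, n).astype(bool)

# Simple linear regressions: cuff pressure vs. age, and vs. ETT size.
print(stats.linregress(age, cuff_pressure))
print(stats.linregress(tube_size, cuff_pressure))

# Two-tailed t test comparing mean cuff pressure between male and female patients.
print(stats.ttest_ind(cuff_pressure[male], cuff_pressure[~male]))
```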
An overwhelming majority of prehospital intubations are associated with elevated cuff pressures, and cuff pressure monitoring education is indicated to address this phenomenon.
We evaluated the risk of patients contracting coronavirus disease 2019 (COVID-19) during their hospital stay to inform the safety of hospitalization for a non–COVID-19 indication during this pandemic.
A case series of adult patients hospitalized for 2 or more nights from May 15 to June 15, 2020, at a large tertiary-care hospital in the midwestern United States was reviewed. All patients were screened at admission with a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) polymerase chain reaction (PCR) test. Selected adult patients were also tested by IgG serology. After dismissal, patients with negative serology and PCR at admission were asked to undergo repeat serologic testing 14–21 days after discharge. The primary outcome was healthcare-associated COVID-19, defined as a new positive SARS-CoV-2 PCR test on or after day 4 of the hospital stay or within 7 days of hospital dismissal, or seroconversion in patients previously established as seronegative.
Of the 2,068 eligible adult patients, 1,778 (86.0%) completed admission PCR testing, and 1,339 (64.7%) also completed admission serology testing. Of the 1,310 (97.8%) who were negative on both PCR and serology, 445 (34.0%) repeated postdischarge serology testing. No healthcare-associated COVID-19 cases were detected during the study period. Of the 1,310 eligible PCR-negative and seronegative adults, none tested PCR positive during hospital admission (95% confidence interval [CI], 0.0%–0.3%). Of the 445 (34.0%) who completed postdischarge serology testing, none seroconverted (0.0%; 95% CI, 0.0%–0.9%).
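The zero-event intervals reported above are consistent with an exact (Clopper-Pearson) binomial calculation, sketched below; whether the authors used this exact method is an assumption.

```python
# Hedged illustration: exact binomial confidence limits for k events in n trials.
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    lo = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lo, hi

print(clopper_pearson(0, 1310))   # ~ (0.0, 0.0028) -> 0.0%-0.3%
print(clopper_pearson(0, 445))    # ~ (0.0, 0.0083) -> 0.0%-0.9%
```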
We found low likelihood of hospital-associated COVID-19 with strict adherence to universal masking, physical distancing, and hand hygiene along with limited visitors and screening of admissions with PCR.
Pharmacogenomic testing has emerged to aid medication selection for patients with major depressive disorder (MDD) by identifying potential gene-drug interactions (GDI). Many pharmacogenomic tests are available with varying levels of supporting evidence, including direct-to-consumer and physician-ordered tests. We retrospectively evaluated the safety of using a physician-ordered combinatorial pharmacogenomic test (GeneSight) to guide medication selection for patients with MDD in a large, randomized, controlled trial (GUIDED).
Patients diagnosed with MDD who had an inadequate response to ≥1 psychotropic medication were randomized to treatment as usual (TAU) or combinatorial pharmacogenomic test-guided care (guided-care). All patients received combinatorial pharmacogenomic testing, and medications were categorized by predicted GDI (no, moderate, or significant GDI). Patients and raters were blinded to study arm, and physicians were blinded to test results for patients in TAU, through week 8. Measures included adverse events (AEs, present/absent), worsening suicidal ideation (increase of ≥1 on the corresponding HAM-D17 question), and symptom worsening (HAM-D17 increase of ≥1). These measures were evaluated based on medication changes [add only, drop only, switch (add and drop), any, and none] and study arm, as well as baseline medication GDI.
Most patients had a medication change between baseline and week 8 (938/1,166; 80.5%), including 269 (23.1%) who added only, 80 (6.9%) who dropped only, and 589 (50.5%) who switched medications. In the full cohort, changing medications resulted in an increased relative risk (RR) of experiencing AEs at both week 4 and 8 [RR 2.00 (95% CI 1.41–2.83) and RR 2.25 (95% CI 1.39–3.65), respectively]. This was true regardless of arm, with no significant difference observed between guided-care and TAU, though the RRs for guided-care were lower than for TAU. Medication change was not associated with increased suicidal ideation or symptom worsening, regardless of study arm or type of medication change. Special attention was focused on patients who entered the study taking medications identified by pharmacogenomic testing as likely having significant GDI; those who were only taking medications subject to no or moderate GDI at week 8 were significantly less likely to experience AEs than those who were still taking at least one medication subject to significant GDI (RR 0.39, 95% CI 0.15–0.99, p=0.048). No other significant differences in risk were observed at week 8.
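As a hedged illustration of how a relative risk and its 95% CI can be derived from a 2×2 table, the sketch below uses the standard log-scale (Katz) interval; the event counts are invented placeholders rather than GUIDED data (the group sizes 938 and 228 follow from the abstract), and the authors' exact CI method is not stated.

```python
# Illustrative sketch: relative risk with a log-scale (Wald/Katz) 95% CI.
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """a/n1 = events among exposed (medication change); b/n2 = events among unexposed."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical event counts: 120 AEs among 938 patients who changed medications,
# 15 AEs among the 228 who did not. Yields RR ~ 1.9 (roughly 1.2-3.3).
print(relative_risk(a=120, n1=938, b=15, n2=228))
```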
These data indicate that patient safety in the combinatorial pharmacogenomic test-guided care arm was no worse than in TAU in the GUIDED trial. Moreover, combinatorial pharmacogenomic-guided medication selection may reduce some safety concerns. Collectively, these data demonstrate that combinatorial pharmacogenomic testing can be adopted safely into clinical practice without increasing the risk of symptom worsening among patients.
ABSTRACT IMPACT: This work examines the association between diabetes mellitus and latent tuberculosis infection among a cohort of household contacts exposed to active tuberculosis in Ethiopia, focusing attention on the need for further translational research to determine the mechanisms of susceptibility to Mycobacterium tuberculosis infection among individuals with diabetes and pre-diabetes.
OBJECTIVES/GOALS: Diabetes mellitus (DM) is an established risk factor for active TB disease, but there is limited understanding of the relationship between DM and latent tuberculosis infection (LTBI). We sought to determine the relationship between DM or pre-DM and LTBI among household or close contacts (HHCs) of active TB cases in Ethiopia.
METHODS/STUDY POPULATION: We conducted a cross-sectional study of the HHCs of index active TB cases enrolled in an ongoing TB Research Unit (TBRU) study in Addis Ababa, Ethiopia. HHCs of individuals with laboratory-confirmed TB had QuantiFERON®-TB Gold Plus (QFT) and glycated hemoglobin (HbA1c) tests performed. LTBI was defined as a positive QFT and lack of symptoms. HbA1c results were used to define no DM (HbA1c <5.7%), pre-DM (HbA1c 5.7-6.5%), and DM (HbA1c >6.5% or prior history of diabetes). Logistic regression was used to estimate adjusted odds ratios (OR) and 95% confidence intervals (CI), adjusting for age, sex and HIV status as potential confounders.
RESULTS/ANTICIPATED RESULTS: Among 466 HHCs, the median age was 29 years (IQR 23-38), 58.8% were female, 3.4% were HIV-positive, and median BMI was 20.9 kg/m² (IQR 18.9-23.8). Overall, 329 HHCs (70.6%) had LTBI, 26 (5.6%) had DM and 73 (15.7%) had pre-DM. LTBI prevalence was 68.9% in HHCs without DM, 72.6% in those with pre-DM (OR 1.19, 95% CI 0.69-2.13), and 88.5% in those with DM (OR 3.45, 95% CI 1.17-14.77). In multivariable analysis, there was a trend towards increased LTBI risk among HHCs with DM vs. without DM (OR 2.16, 95% CI 0.67-9.70), but the difference was not statistically significant. Among HHCs with LTBI, the median IFN-γ response to TB1 antigen was modestly greater in those with DM (5.3 IU/mL; IQR 3.0-7.8) and pre-DM (5.4 IU/mL; IQR 2.0-8.4) than in HHCs without DM (4.3 IU/mL; IQR 1.4-7.7).
DISCUSSION/SIGNIFICANCE OF FINDINGS: Our results suggest that DM may increase the risk of LTBI among HHCs recently exposed to active TB. Among those with LTBI, the increased IFN-γ response in the presence of DM and pre-DM may indicate an exaggerated but ineffectual response to TB. Further investigation is needed to assess how dysglycemia affects susceptibility to M. tuberculosis.
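A minimal sketch of the adjusted-odds-ratio analysis described above (logistic regression with adjustment for age, sex, and HIV status) might look like the following; the data frame, variable names, and coding are assumptions for illustration only, not the study's data or code.

```python
# Hypothetical sketch: adjusted odds ratios via logistic regression with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 466  # matches the reported cohort size; records themselves are simulated
df = pd.DataFrame({
    "ltbi": rng.integers(0, 2, n),                      # 1 = positive QFT, asymptomatic
    "dm_status": rng.choice(["none", "pre", "dm"], n),  # from HbA1c categories
    "age": rng.integers(18, 70, n),
    "female": rng.integers(0, 2, n),
    "hiv": rng.integers(0, 2, n),
})

# Logistic model with 'none' as the reference DM category; exponentiated
# coefficients are odds ratios adjusted for age, sex, and HIV status.
model = smf.logit(
    "ltbi ~ C(dm_status, Treatment('none')) + age + female + hiv", data=df
).fit(disp=0)
or_table = np.exp(model.conf_int())
or_table["OR"] = np.exp(model.params)
print(or_table)
```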
Attention deficit/hyperactivity disorder is common in the general population and is now diagnosed in 4%–12% of children. Children with CHD have been shown to be at increased risk for attention deficit/hyperactivity disorder. Case reports have led to concern regarding the use of attention deficit/hyperactivity disorder medications in children with underlying CHD. We hypothesised that medical therapy for patients with CHD and attention deficit/hyperactivity disorder is safe.
A single-centre, retrospective chart review was performed evaluating for adverse events in patients aged 4–21 years with CHD who received attention deficit/hyperactivity disorder therapy over a 5-year span. Inclusion criteria were a diagnosis of CHD and concomitant medical therapy with amphetamines, methylphenidate, or atomoxetine. Patients with trivial or spontaneously resolved CHD were excluded from analysis.
Among 831 patients with CHD (mean age 12.9 years) who received stimulants, only one adverse cardiovascular event was identified. Median follow-up time was 686 days, and sensitivity analysis yielded an adverse event prevalence of 0.21%. This episode consisted of increased frequency of supraventricular tachycardia in a patient who had this condition prior to initiation of medical therapy; the condition improved with discontinuation of attention deficit/hyperactivity disorder therapy.
The incidence of significant adverse cardiovascular events in our population was similar to the prevalence of supraventricular tachycardia in the general population. Our single-centre experience demonstrated no increased risk of adverse events related to medical therapy for children with attention deficit/hyperactivity disorder and underlying CHD. Further population-based studies are indicated to validate these findings.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19), with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise, and geographic locations will be critical.
Late Cretaceous tracks attributable to deinonychosaurs in North America are rare, with only one occurrence of Menglongipus from Alaska and two possible, but indeterminate, occurrences reported from Mexico. Here we describe the first probable deinonychosaur tracks from Canada: a possible trackway and one isolated track on a single horizon from the Upper Cretaceous Wapiti Formation (upper Campanian) near Grande Prairie in Alberta. The presence of a relatively short digit IV differentiates these from tracks attributed to dromaeosaurids, suggesting the trackmaker was more likely a troodontid. Other noted characteristics of the Wapiti specimens include a rounded heel margin, the absence of a digit II proximal pad impression, and a broad, elliptical digit III. Monodactyl tracks occur in association with the didactyl tracks, mirroring similar discoveries from the Early Cretaceous of China and providing additional support for their interpretation as deinonychosaurian traces. Although we refrain from assigning the new Wapiti specimens to any ichnotaxon because of their relatively poor undertrack preservation, this discovery is an important addition to the deinonychosaur track record; it helps to fill a poorly represented geographic and temporal window in their known distribution, and demonstrates a greater North American deinonychosaur ichnodiversity than previously recognized.
The goal of this study was to assess the utility of participatory needs assessment processes for continuous improvement of developing clinical and translational research (CTR) networks. Our approach expanded on evaluation strategies for CTR networks, centers, and institutes, which often survey stakeholders to identify infrastructure or resource needs, using the case example of the Great Plains IDeA-CTR Network. Our 4-stage approach (i.e., pre-assessment, data collection, implementation of needs assessment derived actions, monitoring of action plan) included a member survey (n = 357) and five subsequent small group sessions (n = 75 participants) to better characterize needs identified in the survey and to provide actionable recommendations. This participatory, mixed-methods needs assessment and strategic action planning process yielded 11 inter-related recommendations. These recommendations were presented to the CTR steering committee as inputs to develop detailed, prioritized action plans. Preliminary evaluation shows progress towards improved program capacity and effectiveness of the network to respond to member needs. The participatory, mixed-methods needs assessment and strategic planning process allowed a wide range of stakeholders to contribute to the development of actionable recommendations for network improvement, in line with the principles of team science.
Kernza® intermediate wheatgrass [Thinopyrum intermedium (Host) Barkworth & Dewey], the first perennial grain crop to come to market in North America, can provide a number of ecosystem services when integrated into cropping systems that are dominated by annual grain crops. However, grain yield from Kernza is lower than that of comparable annual cereal crops such as wheat and oats. Also, although Kernza is a long-lived perennial that can persist for decades, grain yield tends to decline as Kernza stands age, leading most farmers to replant or rotate to a different crop after 3–5 years. Increased intraspecific competition as stand density increases with age has been reported to cause these grain yield declines. We investigated the effect of strip-tillage applied at two different timings, between the third and fourth grain harvests, in a Kernza stand in upstate New York. Strip-tillage applied in late fall, as plants were entering dormancy, increased grain yield by 61% compared with the control treatment without strip-tillage. Total crop biomass was not reduced, however, resulting in a greater harvest index for the fall strip-tillage treatment. Strip-tillage applied before stem elongation the following spring reduced overall tiller density and total crop biomass but did not affect tiller fertility or grain yield compared with the control. The increased grain yield in the fall strip-tillage treatment was due to an increase in the percentage of tillers that produced mature seedheads. This suggests that grain yield decline over time is at least partially caused by competition between tillers in dense stands. These results support further research and development of strip-tillage and other forms of managed disturbance as tools for maintaining Kernza grain yield over time.
Optical tracking systems typically trade off between astrometric precision and field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers with relatively low astrometric precision, applied to the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 September 2017. As part of a trajectory designed to bring OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to enable triangulation of observations. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-hour collection window, we gathered 15,439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we detected 2,090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109,262 km length of the OSIRIS-REx trajectory, demonstrating the impressive capability of a networked approach to Space Surveillance and Tracking.
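To illustrate why triangulatable observations from multiple stations are valuable, the sketch below computes the least-squares intersection of line-of-sight rays from known station positions. It is a geometric illustration only, not the project's streak-detection or orbit-determination pipeline, and all coordinates are invented.

```python
# Minimal sketch: least-squares intersection of line-of-sight rays from several
# observing stations, recovering a target position from bearing-only observations.
import numpy as np

def triangulate(origins, directions):
    """Least-squares closest point to a set of 3D rays (origin + t * direction)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two hypothetical station positions (km, arbitrary Earth-centred frame) observing
# one hypothetical target; with exact rays the target position is recovered.
stations = [np.array([-4052.0, 4212.0, -2545.0]), np.array([-4646.0, 2553.0, -3534.0])]
target = np.array([-30000.0, 20000.0, -15000.0])
rays = [(target - s) / np.linalg.norm(target - s) for s in stations]
print(triangulate(stations, rays))
```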
As the pathophysiology of Covid-19 emerges, this paper describes dysphagia as a sequela of the disease, including its diagnosis and management, hypothesised causes, symptomatology in relation to viral progression, and concurrent variables such as intubation, tracheostomy and delirium, at a tertiary UK hospital.
During the first wave of the Covid-19 pandemic, 208 of 736 patients (28.9 per cent) admitted to our institution with SARS-CoV-2 were referred for swallow assessment. Of these 208 patients, 102 were admitted to the intensive treatment unit for mechanical ventilation support, of whom 82 were tracheostomised. The majority of patients regained near-normal swallow function prior to discharge, regardless of intubation duration or tracheostomy status.
Dysphagia is prevalent in patients admitted either to the intensive treatment unit or the ward with Covid-19 related respiratory issues. This paper describes the crucial role of intensive swallow rehabilitation to manage dysphagia associated with this disease, including therapeutic respiratory weaning for those with a tracheostomy.
Psychosis is associated with a reasoning bias, which manifests as a tendency to ‘jump to conclusions’. We examined this bias in people at clinical high-risk for psychosis (CHR) and investigated its relationship with their clinical outcomes.
In total, 303 CHR subjects and 57 healthy controls (HC) were included. Both groups were assessed at baseline, and after 1 and 2 years. A ‘beads’ task was used to assess reasoning bias. Symptoms and level of functioning were assessed using the Comprehensive Assessment of At-Risk Mental States scale (CAARMS) and the Global Assessment of Functioning (GAF), respectively. During follow up, 58 (16.1%) of the CHR group developed psychosis (CHR-T), and 245 did not (CHR-NT). Logistic regressions, multilevel mixed models, and Cox regression were used to analyse the relationship between reasoning bias and transition to psychosis and level of functioning, at each time point.
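As an illustration of the survival-analysis component mentioned above, the sketch below fits a Cox proportional-hazards model relating a baseline jumping-to-conclusions indicator to time to transition, using the lifelines package; the column names and toy data are assumptions, not the study dataset.

```python
# Hedged sketch: Cox regression of time to transition on baseline covariates.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 303  # matches the reported CHR sample size; records themselves are simulated
df = pd.DataFrame({
    "followup_days": rng.integers(30, 730, n),   # observed follow-up time
    "transition": rng.integers(0, 2, n),         # 1 = developed psychosis
    "jtc_baseline": rng.integers(0, 2, n),       # 1 = jumped to conclusions on beads task
    "age": rng.integers(16, 35, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_days", event_col="transition")
cph.print_summary()   # hazard ratios for jtc_baseline and age
```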
There was no association between reasoning bias at baseline and the subsequent onset of psychosis. However, when assessed after the transition to psychosis, CHR-T participants showed a greater tendency to jump to conclusions than CHR-NT and HC participants (55%, 17%, and 17%, respectively; χ2 = 8.13, p = 0.012). There was a significant association between jumping to conclusions (JTC) at baseline and a reduced level of functioning at 2-year follow-up in the CHR group after adjusting for transition, gender, ethnicity, age, and IQ.
In CHR participants, JTC at baseline was associated with adverse functioning at the follow-up. Interventions designed to improve JTC could be beneficial in the CHR population.
Antarctica's ice shelves modulate the flow of the grounded ice, and weakening of ice shelves due to climate forcing will decrease their ‘buttressing’ effect, causing a response in the grounded ice. The processes governing ice-shelf weakening are complex, and uncertainties in the response of the grounded ice sheet are also difficult to assess. The Antarctic BUttressing Model Intercomparison Project (ABUMIP) compares ice-sheet model responses to a decrease in buttressing by investigating the ‘end-member’ scenario of total and sustained loss of ice shelves. Although unrealistic, this scenario enables gauging the sensitivity of an ensemble of 15 ice-sheet models to a total loss of buttressing, hence exhibiting the full potential of marine ice-sheet instability. All models predict that this scenario leads to multi-metre (1–12 m) sea-level rise over 500 years from present day. West Antarctic ice sheet collapse alone leads to a 1.91–5.08 m sea-level rise due to the marine ice-sheet instability. Mass loss rates are a strong function of the sliding/friction law, with plastic laws causing a further destabilization of the Aurora and Wilkes Subglacial Basins, East Antarctica. Improvements to marine ice-sheet models have greatly reduced variability between modelled ice-sheet responses to extreme ice-shelf loss, e.g. compared to the SeaRISE assessments.
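For readers unfamiliar with the distinction drawn above, a commonly used schematic power-law form of basal friction (an assumption about the general form, not the specific laws implemented by the ABUMIP models) is:

```latex
% Schematic power-law basal friction relation (illustrative form only).
\[
  \boldsymbol{\tau}_b \;=\; C \,\lvert \mathbf{u}_b \rvert^{\frac{1}{m}-1}\,\mathbf{u}_b ,
\]
```

Here, m = 1 gives a linear-viscous bed, whereas in the limit m → ∞ the basal stress saturates at the yield value C regardless of sliding speed; such a plastic bed offers no additional resistance as sliding accelerates, which is consistent with the stronger destabilization reported above for plastic laws.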
Cognitive behavior therapy (CBT) is effective for most patients with social anxiety disorder (SAD), but a substantial proportion fails to remit. Experimental and clinical research suggests that enhancing CBT with imagery-based techniques could improve outcomes. It was hypothesized that imagery-enhanced CBT (IE-CBT) would be superior to verbally-based CBT (VB-CBT) on pre-registered outcomes.
A randomized controlled trial of IE-CBT v. VB-CBT for social anxiety was completed in a community mental health clinic setting. Participants were randomized to IE-CBT (n = 53) or VB-CBT (n = 54), with 1-month (primary end point) and 6-month follow-up assessments. Participants completed 12 weekly 2-hour sessions of IE-CBT or VB-CBT plus a 1-month follow-up.
Intention-to-treat analyses showed very large within-treatment effect sizes on social interaction anxiety at all time points (ds = 2.09–2.62), with no between-treatment differences on this outcome, clinician-rated severity [1-month OR = 1.45 (0.45, 4.62), p = 0.53; 6-month OR = 1.31 (0.42, 4.08), p = 0.65], SAD remission (1-month: IE = 61.04%, VB = 55.09%, p = 0.59; 6-month: IE = 58.73%, VB = 61.89%, p = 0.77), or secondary outcomes. Three adverse events were noted (substance abuse, n = 1 in IE-CBT; temporary increase in suicide risk, n = 1 in each condition, with one participant withdrawn at 1-month follow-up).
Group IE-CBT and VB-CBT were safe and there were no significant differences in outcomes. Both treatments were associated with very large within-group effect sizes and the majority of patients remitted following treatment.
With over a century of records, we present a detailed analysis of the spatial and temporal occurrence of marine turtle sightings and strandings in the UK and Ireland between 1910 and 2018. Records of hard-shell turtles, including loggerhead turtles (Caretta caretta, N = 240) and Kemp's ridley turtles (Lepidochelys kempii, N = 61), have significantly increased over time, although records have decreased noticeably in the most recent years. The majority of records of hard-shell turtles were of juveniles and occurred in the boreal winter months, when waters in the North-east Atlantic are coolest. They generally occurred on the western aspects of the UK and Ireland, highlighting a pattern of decreasing records with increasing latitude and supporting previous suggestions that juvenile turtles arrive in these waters via the North Atlantic current systems. Similarly, the majority of strandings and sightings of leatherback turtles (Dermochelys coriacea, N = 1683) occurred on the western aspects of the UK and along the entirety of Ireland's coastline. In contrast to hard-shell turtles, leatherback turtles were most commonly recorded in the boreal summer months, with the majority of strandings being of adult-sized animals; annual records of leatherbacks have also decreased recently. The cause of the recent decreases in turtle strandings and sightings across all three species is unclear; however, changes in overall population abundance, prey availability, anthropogenic threats and variable reporting effort could all contribute. Our results provide a valuable reference point from which to assess species range modification due to climate change, to identify possible evidence of anthropogenic threats and to assess the future trajectory of marine turtle populations in the North Atlantic.