Between 2001 and 2017, the Royal Botanic Garden Edinburgh conducted training and research in Belize built around an annual two-week field course, part of the Edinburgh M.Sc. programme in Biodiversity and Taxonomy of Plants, focused on tropical plant identification, botanical collecting and tropical fieldwork skills. This long-term collaboration in one country has led to additional benefits, most notably capacity building, acquisition of new country records, completion of M.Sc. thesis projects and publication of the findings in journal articles, and continued cooperation. Detailed summaries are provided for the specimens collected by students during the field course or on return visits to Belize for M.Sc. thesis projects. Additionally, 15 species not recorded in the national checklist for Belize are reported. The information in this paper highlights the benefits of collaborations between institutions and countries for periods greater than the typical funding cycles of three to five years.
The Coronavirus (Covid-19) pandemic is exerting unprecedented pressure on NHS Health and Social Care provisions, with frontline staff, such as those of critical care units, encountering vast practical and emotional challenges on a daily basis. Although staff are being supported through organisational provisions, facilitated by those in leadership roles, the emergence of mental health difficulties or the exacerbation of existing ones amongst these members of staff is a cause for concern. Acknowledging this, academics and healthcare professionals alike are calling for psychological support for frontline staff, which not only addresses distress during the initial phases of the outbreak but also over the months, if not years, that follow. Fortunately, mental health services and psychology professional bodies across the United Kingdom have issued guidance to meet these needs. An attempt has been made to translate these sets of guidance into clinical provisions via the recently established Homerton Covid Psychological Support (HCPS) pathway delivered by Talk Changes (Hackney & City IAPT). This article describes the phased, stepped-care and evidence-based approach that has been adopted by the service to support local frontline NHS staff. We wish to share our service design and pathway of care with other Improving Access to Psychological Therapies (IAPT) services who may also seek to support hospital frontline staff within their associated NHS Trusts and in doing so, lay the foundations of a coordinated response.
Key learning aims
(1) To understand the ways staff can be psychologically and emotionally impacted by working on the frontline of disease outbreaks.
(2) To understand the ways in which IAPT services have previously supported populations exposed to crises.
(3) To learn ways of delivering psychological support and interventions during a pandemic context based on existing guidance and research.
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of ±10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 ± 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%; iLR 0.20 [95% CI 0.091-0.44]); Moderate (probability 1.3%; iLR 0.79 [0.68-0.92]); High (probability 2.6%; iLR 2.2 [1.9-2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056-0.52], Medium iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients' risk for stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
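The interval likelihood ratios reported above follow the standard definition for a risk stratum: the proportion of outcome-positive patients who fall in the stratum, divided by the proportion of outcome-negative patients who fall in it. A minimal sketch; the counts below are illustrative, not taken from the study:

```python
def interval_likelihood_ratio(cases_in_stratum, total_cases,
                              noncases_in_stratum, total_noncases):
    """Interval LR for one risk stratum:
    P(stratum | outcome) / P(stratum | no outcome)."""
    return ((cases_in_stratum / total_cases)
            / (noncases_in_stratum / total_noncases))

# Illustrative counts only: 10 of 100 outcome-positive patients and
# 50 of 1000 outcome-negative patients fall in a hypothetical stratum.
print(interval_likelihood_ratio(10, 100, 50, 1000))  # → 2.0
```

An iLR above 1 shifts the post-test probability up for patients in that stratum; below 1 shifts it down, which is how the Low/Moderate/High bands above are interpreted.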
Objective: Post-stroke cognitive impairment is common, but mechanisms and risk factors are poorly understood. Frailty may be an important risk factor for cognitive impairment after stroke. We investigated the association between pre-stroke frailty and acute post-stroke cognition. Methods: We studied consecutively admitted acute stroke patients in a single urban teaching hospital during three recruitment waves between May 2016 and December 2017. Cognition was assessed using the Mini-Montreal Cognitive Assessment (min=0; max=12). A Frailty Index was used to generate frailty scores for each patient (min=0; max=100). Clinical and demographic information was collected, including pre-stroke cognition, delirium, and stroke severity. We conducted univariate and multiple linear regression analyses with covariates forced in (age, sex, stroke severity, stroke type, pre-stroke cognitive impairment, delirium, previous stroke/transient ischemic attack) to investigate the association between pre-stroke frailty and post-stroke cognition. Results: Complete data were available for 154 stroke patients. Mean age was 68 years (SD=11; range=32–97); 93 (60%) were male. Median Mini-Montreal Cognitive Assessment score was 8 (IQR=4–12). Mean Frailty Index score was 18 (SD=11). Pre-stroke cognitive impairment was apparent in 13/154 (8%) patients. Pre-stroke frailty was significantly associated with lower post-stroke cognition (Standardized-Beta=−0.40; p<0.001) and this association was independent of covariates (Unstandardized-Beta=−0.05; p=0.005). Additional significant variables in the multiple regression model were age (Unstandardized-Beta=−0.05; p=0.002), delirium (Unstandardized-Beta=−2.81; p<0.001), pre-stroke cognitive impairment (Unstandardized-Beta=−2.28; p=0.001), and stroke severity (Unstandardized-Beta=−0.20; p<0.001).
Conclusions: Pre-stroke frailty may be a moderator of post-stroke cognition, independent of other well-established post-stroke cognitive impairment risk factors. (JINS, 2019, 25, 501–506)
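The multiple linear regression with covariates forced in, as described above, amounts to ordinary least squares with every predictor retained regardless of significance. A minimal sketch; the variable names and data below are synthetic placeholders, not the study's:

```python
import numpy as np

def fit_forced_in_ols(y, covariates):
    """Ordinary least squares with every covariate forced into the model.
    covariates: array of shape (n_patients, n_covariates).
    Returns unstandardized coefficients, intercept first."""
    X = np.column_stack([np.ones(len(y)), covariates])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic check: cognition = 12 - 0.05 * frailty (no noise) should be
# recovered exactly by the fit.
frailty = np.arange(0, 50, 5, dtype=float).reshape(-1, 1)
cognition = 12 - 0.05 * frailty.ravel()
print(fit_forced_in_ols(cognition, frailty))  # intercept ≈ 12, slope ≈ -0.05
```

In practice all covariates (age, sex, stroke severity, and so on) would be added as further columns of the covariate matrix, which is what "forced in" means here.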
Depression is a common post-stroke complication. Pre-stroke depression may be an important contributor; however, the epidemiology of pre-stroke depression is poorly understood. Using systematic review and meta-analysis, we described the prevalence of pre-stroke depression and its association with post-stroke depression.
We searched multiple cross-disciplinary databases from inception to July 2017 and extracted data on the prevalence of pre-stroke depression and its association with post-stroke depression. We assessed the risk of bias (RoB) using validated tools. We described summary estimates of prevalence and summary odds ratio (OR) for association with post-stroke depression, using random-effects models. We performed subgroup analysis describing the effect of depression assessment method. We used a funnel plot to describe potential publication bias. The strength of evidence presented in this review was summarised via ‘GRADE’.
Of 11 884 studies identified, 29 were included (total participants n = 164 993). Pre-stroke depression pooled prevalence was 11.6% [95% confidence interval (CI) 9.2–14.7]; range: 0.4–24% (I2 95.8). Prevalence of pre-stroke depression varied by assessment method (p = 0.02) with clinical interview suggesting greater pre-stroke depression prevalence (~17%) than case-note review (9%) or self-report (11%). Pre-stroke depression was associated with increased odds of post-stroke depression; summary OR 3.0 (95% CI 2.3–4.0). All studies were judged to be at RoB: 59% of included studies had an uncertain RoB in stroke assessment; 83% had high or uncertain RoB for pre-stroke depression assessment. Funnel plot indicated no risk of publication bias. The strength of evidence based on GRADE was ‘very low’.
One in six stroke patients has had pre-stroke depression. Reported rates may be routinely underestimated due to limitations of assessment. Pre-stroke depression significantly increases the odds of post-stroke depression.
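Random-effects pooling of the kind used above is commonly done with the DerSimonian–Laird estimator, which adds a between-study variance term to each study's weight. A minimal sketch; the study effects and variances below are illustrative placeholders, not the review's data:

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate.
    effects: per-study estimates (e.g. log odds ratios);
    variances: their within-study variances.
    Returns (pooled estimate, standard error, tau^2)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, math.sqrt(1.0 / sum(w_star)), tau2

# Illustrative log-OR values; a pooled log-OR near 1.1 would back-transform
# to an OR of about 3, the order of magnitude reported above.
pooled, se, tau2 = dersimonian_laird([1.0, 1.2, 1.1], [0.05, 0.08, 0.04])
```

Prevalences are usually pooled on the logit scale with the same machinery, then back-transformed to percentages.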
Management of beef suckler cattle herds requires a difficult but vitally important balance between farm profits, animal health and welfare and sustainable food production. A dynamic programming (DP) model was implemented to investigate the consequences of replacement and management decisions on the interactions and possible trade-offs between animal welfare, fertility and profitability in breeding beef suckler cattle herds. The model maximized profit from the current cow and all successors by identifying the best keep/replace decision. The 150 states incorporated in the DP model were all combinations of: ten cow parities, five calving periods (including one barren state) as fertility indicators and three body condition scores at weaning as an animal welfare indicator reflecting feeding and nutritional conditions of animals. Statistical models were fitted to data from a breeding suckler cattle herd, consisting of performance records of 200 cattle over 5 years, to parameterize the DP model. Estimated parameters used in the DP model were: (i) probabilities of transitions between states and (ii) probability of involuntary culling. These estimates were used in the form of conditional probabilities of successful or failed (as a result of involuntary culling) transitions to the next state. In addition, statistical models were used to estimate the probability of calving difficulty. There was strong evidence (P < 0·001) that parity affected calving difficulty and weak evidence (P = 0·067) that parity affected the incidence of involuntary culling. The DP model outcomes indicated that cows calving very early, i.e. those that conceived in the first 21 days after artificial insemination, showed reduced frequencies of calving difficulty as well as voluntary culling, and so gave better financial returns than late-calving cows and barren cows.
As a result, fewer replacements were needed, which reduced the frequency of calving difficulty, further implying a win–win scenario for both profit and welfare. In contrast, late-calving animals showed an increased frequency of calving difficulty, were less profitable and were more prone to being culled. Results of sensitivity analysis showed that the optimum voluntary culling rate was sensitive to commodity market prices. These findings suggest that well-informed nutrition and reproduction management could deliver a win–win outcome for profit and animal welfare.
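The keep/replace optimization described above can be sketched as value iteration over cow states. The states, profits, transition probabilities and prices below are illustrative placeholders, not the published parameterization:

```python
def optimize_keep_replace(profit, trans, heifer_cost, salvage,
                          discount=0.95, n_iter=500):
    """Value iteration for the keep/replace decision.
    profit[s]: expected annual net return of a cow in state s.
    trans[s]: dict mapping next state -> probability for a kept cow.
    Replacement sells the cow (salvage), buys a heifer (heifer_cost),
    and restarts the chain in state 0 (assumed fresh-heifer state)."""
    n = len(profit)
    v = [0.0] * n
    policy = ['keep'] * n
    for _ in range(n_iter):
        v_new = [0.0] * n
        for s in range(n):
            keep = profit[s] + discount * sum(p * v[t]
                                              for t, p in trans[s].items())
            replace = (salvage - heifer_cost + profit[0]
                       + discount * sum(p * v[t]
                                        for t, p in trans[0].items()))
            v_new[s] = max(keep, replace)
            policy[s] = 'keep' if keep >= replace else 'replace'
        v = v_new
    return policy, v

# Two illustrative states: 0 = productive early calver,
# 1 = late-calving/unprofitable.
policy, _ = optimize_keep_replace(
    profit=[10.0, -5.0],
    trans=[{0: 0.5, 1: 0.5}, {1: 1.0}],
    heifer_cost=5.0, salvage=4.0)
print(policy)  # → ['keep', 'replace']
```

The full model would use the 150 parity × calving-period × body-condition states, with the fitted transition and involuntary-culling probabilities filling `trans`.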
Background: It has been hypothesized that [18F]-sodium fluoride (NaF), imaged with positron emission tomography (PET), binds to hydroxyapatite molecules expressed in regions with active calcification. Therefore, we aimed to validate NaF as a marker of hydroxyapatite expression in high-risk carotid plaque. Methods: Eleven patients (69 ± 5 years, 3 female) scheduled for carotid endarterectomy were prospectively recruited for NaF PET/CT. One patient received a second, contralateral endarterectomy; two patients were excluded (intolerance to contrast media and PET/CT misalignment). The bifurcation of the common carotid was used as the reference point; NaF uptake (tissue-to-blood ratio, TBR) was measured at every PET slice extending 2 cm above and below the bifurcation. Excised plaque was stained with Goldner's Trichrome, and whole-slide digitized images were used to quantify hydroxyapatite expression. Pathology was co-registered with PET. Results: NaF uptake was related to the extent of hydroxyapatite expression (r=0.45, p<0.001). Upon classifying bilateral plaque by symptomatology, symptomatic plaque associated with cerebrovascular events (3.75±1.1 TBR, n=9) had greater NaF uptake than clinically silent asymptomatic plaque (2.79±0.6 TBR, n=11) (p=0.04). Conclusion: NaF uptake is related to hydroxyapatite expression and is increased in plaque associated with cerebrovascular events. NaF may serve as a novel biomarker of active calcification and plaque vulnerability.
A cardiac source is often implicated in strokes where the deficit includes aphasia. However, less is known about the etiology of isolated aphasia during transient ischemic attack (TIA). Our objective was to determine whether patients with isolated aphasia are likely to have a cardioembolic etiology for their TIA.
We prospectively studied a cohort of TIA patients in eight tertiary-care emergency departments. Patients with isolated aphasia were identified by the treating physician at the time of emergency department presentation. Patients with dysarthria (i.e., a phonation disturbance) were not included. Potential cardiac sources for embolism were defined as atrial fibrillation on history, electrocardiogram, Holter monitor, atrial fibrillation on echocardiography, or thrombus on echocardiography.
Of the 2,360 TIA patients identified, 1,155 had neurological deficits at the time of the emergency physician assessment and were included in this analysis, and 41 had isolated aphasia as their only neurological deficit. Patients with isolated aphasia were older (73.9±10.0 v. 67.2±14.5 years; p=0.003), more likely to have a history of heart failure (9.8% v. 2.6%; p=0.027), and were twice as likely to have any cardiac source of embolism (22.0% v. 10.6%; p=0.037).
Isolated aphasia is associated with a high rate of cardiac sources of embolism after TIA. Emergency patients with isolated aphasia diagnosed with a TIA warrant a rapid and thorough assessment for a cardioembolic source.
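Comparisons of event rates between two groups, as reported above, are often evaluated with a two-proportion test. A minimal sketch of the pooled z-statistic; the function and the numbers fed to it are illustrative, not the study's actual analysis:

```python
import math

def two_proportion_z(events1, n1, events2, n2):
    """Pooled two-sample z-statistic for comparing two proportions."""
    p1, p2 = events1 / n1, events2 / n2
    pooled = (events1 + events2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Identical proportions give z = 0; larger |z| means stronger evidence
# that the two event rates differ.
print(two_proportion_z(10, 100, 10, 100))  # → 0.0
```

With small strata such as the 41 isolated-aphasia patients here, an exact test (e.g. Fisher's) is often preferred over the normal approximation.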
The incursion of Bluetongue disease into the UK and elsewhere in Northern Europe in 2008 raised concerns about maintaining an appropriate level of preparedness for the encroachment of exotic diseases as circumstances and risks change. Consequently, the Scottish government commissioned the present study to inform policy on the specific threat of Bluetongue virus 8 (BTV8) incursion into Scotland. An interdisciplinary expert panel, including BTV and midge experts, agreed a range of feasible BTV incursion scenarios, patterns of disease spread and specific control strategies. The study was primarily desk-based, applying quantitative methodologies with existing models, where possible, and utilizing data already held by different members of the project team. The most likely distribution of the disease was explored given Scotland's agricultural systems, unique landscape and climate. Epidemiological and economic models were integrated in an ex-ante cost-benefit appraisal of successful prevention of hypothetical BTV8 incursion into Scotland under various feasible incursion scenarios identified by the interdisciplinary panel. The costs of current public and private surveillance efforts were compared to the benefits of the avoided losses of potential disease outbreaks. These avoided losses included the direct costs of alternative vaccination and protection zone (PZ) strategies and their influence on other costs arising from an outbreak, as predicted by the epidemiological model. Benefit-cost ratios were ranked within each incursion scenario to evaluate alternative strategies. In all incursion scenarios, the ranking indicated that a strategy of 100% vaccination within a PZ covering the Scottish counties along the England–Scotland border yielded the least benefit in terms of the extent of avoided outbreak losses (per unit cost). The economically optimal strategy was 50% vaccination with all of Scotland as a PZ.
The results provide an indicator of how resources can best be targeted for an efficient ex-ante control strategy.
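The ranking step above can be sketched directly: compute each strategy's benefit-cost ratio (avoided outbreak losses per unit surveillance and control cost) and sort. The strategy names and figures below are illustrative, not the study's estimates:

```python
def rank_by_bcr(strategies):
    """Sort control strategies by benefit-cost ratio, highest first.
    strategies: list of (name, avoided_losses, cost) tuples."""
    return sorted(strategies, key=lambda s: s[1] / s[2], reverse=True)

# Illustrative strategies (monetary units arbitrary).
ranking = rank_by_bcr([
    ("100% vaccination, border PZ", 100.0, 80.0),       # BCR 1.25
    ("50% vaccination, all-Scotland PZ", 180.0, 90.0),  # BCR 2.0
])
print([name for name, *_ in ranking])
# → ['50% vaccination, all-Scotland PZ', '100% vaccination, border PZ']
```

In the full appraisal the avoided-loss figures would come from the epidemiological model's predicted outbreak costs under each incursion scenario.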
Structural equation modelling and survey data were used to test determinants' influence on farmers' intentions towards Escherichia coli O157 on-farm control. Results suggest that farmers more likely to show willingness to spend money/time or vaccinate to control Escherichia coli O157 are those: who think farmers are most responsible for control; whose income depends more on opening farms to the public; with stronger disease control attitudes; affected by outbreaks; with better knowledge and more informed; with stronger perceptions of biosecurity measures’ practicality; using a health plan; who think farmers are the main beneficiaries of control; and whose farms are dairy rather than beef. The findings might suggest that farmers may implement on-farm controls for E. coli O157 if they identify a clear hazard and if there is greater knowledge of the safety and efficacy of the proposed controls.
The aim of the present study was to compare the effect of changing a range of biological traits on farm profit and greenhouse gas (GHG) emissions (expressed as carbon dioxide equivalent, CO2-eq.) in the UK dairy cow population. A Markov chain approach was used to describe the steady-state herd structure of the average milk-recorded UK dairy herd, as well as to estimate the CO2-eq. emissions per cow, and per kilogram of milk solids (MS). Effects of changing each herd production and fitness trait by one unit (e.g. 1 kg milk; 1% mastitis incidence) were assessed, with derived values for change in profit (economic values) being used in a multi-trait selection index. Of the traits studied, an increase in survival and reductions in milk volume, live weight, residual feed intake, somatic cell count, mastitis incidence, lameness incidence and calving interval were traits that would be both profitable and reduce CO2-eq. emissions per cow and per kg MS of a dairy herd. A multi-trait selection index was used to estimate the annual response in production and fitness traits and the economic response, with an estimate of annual profit per cow from selection on multiple traits. Milk volume, milk fat and protein yield, live weight, survival and dry matter intake were estimated to increase each year and body condition score, residual feed intake, somatic cell count, mastitis incidence, lameness incidence and calving interval were estimated to decrease, with selection on these traits estimated to result in an annual increase of 1% per year in GHG emissions per cow, but a reduction of 0·9% per unit product. Improved efficiencies of production associated with a reduction in milk volume (and increasing fat and protein content), live weight and feed intake (gross and metabolic efficiency, respectively), and increase in health, fertility and overall survival will increase farm annual profit of UK dairy systems and reduce their environmental impact.
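The steady-state herd structure from a Markov chain, as used above, can be sketched by power iteration on the state-transition matrix. The 2-state matrix below is an illustrative toy, not the UK herd parameterization:

```python
import numpy as np

def steady_state(P, n_iter=1000):
    """Stationary distribution of transition matrix P (rows sum to 1),
    found by repeatedly applying P to a uniform start (power iteration)."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(n_iter):
        pi = pi @ P
    return pi / pi.sum()

# Toy example: state 0 = cow retained in herd, state 1 = replaced.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(steady_state(P))  # ≈ [0.833, 0.167]
```

The stationary vector gives the long-run proportion of the herd in each state, from which per-cow emissions and profit can be weighted and summed.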
We present the KMOS (K-band Multi-Object Spectrograph) Cluster and VIRIAL (VLT IR IFU Absorption Line) Guaranteed Time Observation (GTO) programs. KMOS provides 24 arms, each feeding an integral field unit (14×14 spaxels of 0.2″ pixels), for IZ, YJ, H and K band near-infrared (NIR) medium-resolution spectroscopy (R ∼ 3500). Targets are selected from a 7.2′ diameter patrol field. Ultra-deep spectroscopy of ∼ 80 early-type cluster galaxies (∼ 20 hr on source) and ∼ 200 early-type field galaxies (∼ 10 hr on source) at 1 < z < 2 will dramatically improve the situation at z > 1, where measurements of stellar velocity dispersions and absorption indices are limited to a few, often relatively young, passively evolving galaxies (e.g. Bezanson 2013). In ESO Periods P92 and P93, 15 nights' worth of data have been collected for KMOS-Clusters and 6 nights for VIRIAL; this will be supplemented with more data in upcoming semesters. All galaxies have multiband HST imaging, including existing or upcoming WFC3 IR imaging, providing stellar mass maps and sizes. Combined with our dispersion measurements, this will allow us to examine the fundamental plane and the dynamical mass of a large sample of z > 1 galaxies for the first time, for both cluster and field galaxies.
KMOS is a cryogenic infrared spectrograph fed by twenty-four deployable integral field units that patrol a 7.2 arcminute diameter field of view at the Nasmyth focus of the ESO VLT. It is well suited to the study of galaxy clusters at 1 < z < 2, where the well-understood features in the restframe V-band are shifted into the KMOS spectral bands. Coupled with HST imaging, KMOS offers a window on the critical epoch for galaxy evolution, 7–10 Gyr ago, when the key properties of cluster galaxies were established. We aim to investigate the size, mass, morphology and star formation history of galaxies in the clusters. Here we describe the instrument, discuss the status of the observations and report some preliminary results.
‘Cerebral small vessel disease’ is common in older adults and is an important cause of morbidity, functional impairment and cognitive decline. Small vessel disease is a collective term used to describe a number of underlying pathological processes and neuroimaging findings, such as lacunar infarcts, white matter lesions and microhaemorrhages.
With readily available neuroimaging, diagnostic accuracy has improved; however, the management of small vessel disease and prevention of cognitive decline remains uncertain. Treatment of conventional vascular risk factors may be appropriate, but future research is required to provide definitive answers. We have conducted a comprehensive literature review of cerebral small vessel disease in older adults. This highlights the clinical sequelae and underlying pathological processes, whilst discussing novel diagnostic neuroimaging and therapeutic strategies.
The monogenean Protopolystoma xenopodis has been established in Wales for >40 years following introduction with Xenopus laevis from South Africa. This provides an experimental system for determining constraints affecting introduced species in novel environments. Parasite development post-infection was followed at 15, 20 and 25°C for 15 weeks and at 10°C for ⩾1 year and correlated with temperatures recorded in Wales. Development was slowed/arrested at ⩽10°C, which reflects habitat conditions for >6 months/year. There was wide variation in growth at constant temperature (body size differing by >10 times), potentially attributable in part to genotype-specific host-parasite interactions. Parasite density had no effect on size but host sex did: worms in males were 1·8 times larger than in females. Minimum time to patency was 51 days at 25°C and 73 days at 20°C, although some infections were still not patent at either temperature by 105 days p.i. In Wales, the fastest developing infections may mature within one summer (about 12 weeks), possibly accelerated by movements of hosts into warmer surface waters. Otherwise, development slows/stops in October–April, delaying patency to about 1 year p.i., while wide variation in developmental rates may impose delays of 2 years in some primary infections and even longer in secondary infections.
Factors affecting survival of parasites introduced to new geographical regions include changes in environmental temperature. Protopolystoma xenopodis is a monogenean introduced with the amphibian Xenopus laevis from South Africa to Wales (probably in the 1960s), where low water temperatures impose major constraints on life-cycle processes. Effects were quantified by maintenance of eggs from infections in Wales under controlled conditions at 10, 12, 15, 18, 20 and 25°C. The threshold for egg viability/development was 15°C. Mean times to hatching were 22 days at 25°C and 32 days at 20°C, extending to 66 days at 15°C. Field temperature records provided calibration of transmission schedules. Although egg production continues year-round, all eggs produced during >8 months/year die without hatching. Output contributing significantly to transmission is restricted to 10 weeks (May–mid-July). Host infection, beginning after a time lag of 8 weeks for egg development, is also restricted to 10 weeks (July–September). Habitat temperatures (mean 15·5°C in summer 2008) allow only a narrow margin for life-cycle progress: even small temperature increases, predicted with ‘global warming’, enhance infection. This system provides empirical data on the metrics of transmission permitting long-term persistence of isolated parasite populations in limiting environments.
A total of 1590 calves were investigated between May 1972 and December 1975. Twenty-two per cent were treated for respiratory disease and 2·5% died of pneumonia. Almost 80% of the respiratory illness occurred in six sharp outbreaks. Samples for virology were collected routinely from 127 healthy calves and from 354 calves treated for respiratory signs and comprised 1143 nasopharyngeal swabs and 1069 sera. Virus infections were detected on 540 occasions including 135 by parainfluenzavirus type 3 (Pi-3), 78 by respiratory syncytial virus (RSV), 103 by rhinovirus, 49 by bovine virus diarrhoea virus (BVDV), 29 by adenoviruses, 53 by reoviruses and 88 by enteroviruses. The seasonal and age distribution of infections differed between viruses. Only infections by RSV, Pi-3 and BVDV were significantly associated with disease.