Classical stewardship efforts have targeted immunocompetent patients; however, appropriate use of antimicrobials in the immunocompromised host has become a target of interest. Cytomegalovirus (CMV) infection is one of the most common and significant complications after solid-organ transplant (SOT). The treatment of CMV requires a dual approach of antiviral drug therapy and reduction of immunosuppression for optimal outcomes. This dual approach to CMV management increases complexity and requires individualization of therapy to balance antiviral efficacy with the risk of allograft rejection. In this review, we focus on the development and implementation of CMV stewardship initiatives, as a component of antimicrobial stewardship in the immunocompromised host, to optimize the management of prevention and treatment of CMV in SOT recipients. These initiatives have the potential not only to improve judicious use of antivirals and prevent resistance but also to improve patient and graft survival given the interconnection between CMV infection and allograft function.
As the climate changes and ecosystems shift toward novel combinations of species, the methods and metrics of conservation science are becoming less species-centric. To meet this growing need, marine conservation paleobiologists stand to benefit from the addition of new, taxon-free benthic indices to the live–dead analysis tool kit. These indices, which were developed to provide actionable, policy-specific data, can be applied to the readily preservable component of benthic communities (e.g., mollusks) to assess the ecological quality status of the entire community. Because these indices are taxon-free, they remain applicable even as the climate changes and novel communities develop—making them a potentially valuable complement to traditionally applied approaches for live–dead analysis, which tend to focus on maintaining specific combinations of species under relatively stable environmental conditions. Integrating geohistorical data with these established indices has potential to increase the salience of the live–dead approach in the eyes of resource managers and other stakeholders.
Commercialization of crops resistant to application of dicamba is a cause of major concern for sweetpotato producers regarding potential negative impacts due to herbicide drift or sprayer contamination events. A field study was initiated in 2014 and repeated in 2015 to assess impacts of reduced rates of the BAPMA or DGA salt of dicamba, glyphosate, or a combination of each salt with glyphosate on sweetpotato. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1000 of the 1x use rate of each dicamba formulation at 0.56 kg ha−1, glyphosate at 1.12 kg ha−1, and the combination of the two at the aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial or storage root development (30 d after transplanting) in a separate trial. Injury with each salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than that with glyphosate applied alone at equivalent rates, indicating that injury is most attributable to the dicamba in the combination. A quadratic increase in crop injury and a quadratic decrease in crop yield (for most yield grades) were observed with increasing herbicide rate when dicamba was applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate was observed on crop injury or sweetpotato yield when application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10x) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern after initial observation by sweetpotato producers.
However, in some cases yield reduction of no. 1 and marketable grades was observed following the 1/250, 1/100, or 1/10x application rate of dicamba alone or with glyphosate when applied at storage root development.
Introduction: The legalization of cannabis for recreational use in 2018 remains a controversial topic. There are multiple perceived benefits of cannabis, including pain relief, treatment of epilepsy syndromes, and improving body weight in cancer patients. However, there are also many potential risks. Short-term health consequences include cannabinoid hyperemesis syndrome and cannabis-induced psychosis. These conditions directly impact the influx of patients presenting to Emergency Departments (EDs). There is currently limited research on the burden of cannabis legalization; however, the studies performed have shown a significant impact in states where cannabis is legal. A study completed in Colorado found that hospitalization rates with marijuana-related billing codes increased from 274 to 593 per 100 000 hospitalizations after the state's legalization of recreational cannabis. This study aims to examine whether Canada's hospitals are experiencing the same burden as other jurisdictions. Methods: A descriptive study was performed via a retrospective chart review of cannabis-related visits in tertiary EDs in St. John's, NL, from six months prior to the date of legalization of cannabis for recreational use to six months after. Hospital ED visit records from both the Health Sciences Centre and St. Clare's Mercy Hospital were searched using keywords to identify patients who presented with symptoms related to cannabis use. We manually reviewed all visit records that included one or more of these terms to distinguish true positives from false positives unrelated to cannabis use. Results: A total of 287 charts were included in the study; 123 visits were related to cannabis use in the six months prior to legalization, and 164 in the six months after legalization. A significant increase in ED visits following the legalization of recreational cannabis was seen (p < .001). There was no significant difference in the age of users between the two groups.
Additionally, the most common presenting complaint due to cannabis use was vomiting (47.7%), followed by anxiety (12.2%). Conclusion: Following the implementation of the Cannabis Act in Canada, EDs in St. John's, NL, had a statistically significant increase in the number of visits related to cannabis use. It is important to determine such consequences to ensure hospitals and public health agencies are prepared to treat the influx of visits and are better equipped to manage the associated symptoms.
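The abstract does not name the statistical test behind the reported p-value, and the authors' analysis may have used denominators (total ED visits per period) not given here. Purely as an illustration, the two raw visit counts can be compared with a chi-square goodness-of-fit test against an equal split using only the Python standard library (for one degree of freedom the chi-square tail probability reduces to erfc(√(x/2))):

```python
import math

def chi2_gof_two_counts(a, b):
    """Chi-square goodness-of-fit test comparing two counts against an
    equal-split expectation (1 degree of freedom). Illustrative only;
    the study's actual test is not stated in the abstract."""
    expected = (a + b) / 2
    stat = sum((o - expected) ** 2 / expected for o in (a, b))
    # For df = 1 the chi-square survival function is erfc(sqrt(x / 2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Counts reported in the abstract: 123 visits pre- vs. 164 post-legalization
stat, p = chi2_gof_two_counts(123, 164)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```

This simple comparison of raw counts gives a larger p-value than the reported p < .001, which presumably reflects the study's actual (unstated) test and denominators.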
Relatively few studies have assessed the prevalence, correlates, and independent impact on quality of life (QoL) of trichotillomania (TTM) in large samples.
Consecutive participants (N = 7639) were recruited from a cross-sectional web-based study. Sociodemographic data were collected and several validated self-reported mental health measures were completed (Minnesota Impulsive Disorders Interview, Hypomania checklist, Fagerström Test for Nicotine Dependence, Alcohol Use Disorders Identification Test, Early Trauma Inventory Self Report–Short Form, and the Symptom Checklist-90–Revised Inventory). Health-related QoL was assessed with the World Health Organization QoL abbreviated scale (WHOQOL-Bref). Multivariable models adjusted associations for potential confounders.
The sample was predominantly composed of young females (71.3%; mean age: 27.2 ± 7.9 years). The prevalence of probable TTM was 1.4% (95% confidence interval [CI]: 1.2-1.7), and was more common among females. Participants with probable TTM had a greater likelihood of having co-occurring probable depression (adjusted odds ratio [ORadj] = 1.744; 95% CI: 1.187-2.560), tobacco (ORadj = 2.250; 95% CI: 1.191-4.250), and alcohol (ORadj = 1.751; 95% CI: 1.169-2.621) use disorders. Probable TTM was also independently associated with suicidal ideation (ORadj = 1.917; 95% CI: 1.224-3.003) and exposure to childhood sexual abuse (ORadj = 1.221; 95% CI: 1.098-1.358). In addition, participants with a positive screen for TTM had more impaired physical and mental QoL.
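The adjusted odds ratios above come from multivariable logistic regression, where OR = exp(β) and the Wald 95% CI is exp(β ± 1.96·SE). A minimal sketch; the standard error below is not reported in the abstract and is back-calculated from the depression OR's confidence interval purely for illustration:

```python
import math

def or_with_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical inputs: ln(1.744) with SE ~0.196 implied by the
# depression OR's reported interval (1.187-2.560); not study data.
odds_ratio, lo, hi = or_with_ci(math.log(1.744), 0.196)
print(f"OR = {odds_ratio:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```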
TTM was associated with a positive screen for several psychiatric comorbidities as well as impaired physical and psychological QoL. Efforts towards the recognition and treatment of TTM across psycho-dermatology services are warranted.
Convolutional neural networks are a subclass of deep learning or artificial intelligence that are predominantly used for image analysis and classification. This proof-of-concept study attempts to train a convolutional neural network algorithm that can reliably determine if the middle turbinate is pneumatised (concha bullosa) on coronal sinus computed tomography images.
Consecutive high-resolution computed tomography scans of the paranasal sinuses were retrospectively collected between January 2016 and December 2018 at a tertiary rhinology hospital in Australia. The classification layer of Inception-V3 was retrained in Python using a transfer learning method to interpret the computed tomography images. Segmentation analysis was also performed in an attempt to increase diagnostic accuracy.
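The abstract does not give the exact pipeline (preprocessing, optimiser, training schedule), but the described approach of retraining the classification layer of Inception-V3 by transfer learning can be sketched in Keras roughly as follows; the head layers and hyperparameters here are assumptions, not the study's published configuration:

```python
import tensorflow as tf

# weights=None avoids downloading ImageNet weights in this sketch;
# an actual transfer-learning run would use weights="imagenet".
base = tf.keras.applications.InceptionV3(
    include_top=False, weights=None, input_shape=(299, 299, 3))
base.trainable = False  # freeze the pretrained feature extractor

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    # Retrained classification head: concha bullosa present vs. absent
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Only the final dense head is trained at first; the frozen convolutional base supplies generic image features, which is what makes transfer learning feasible on a modest number of CT slices.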
The trained convolutional neural network was found to have diagnostic accuracy of 81 per cent (95 per cent confidence interval: 73.0–89.0 per cent) with an area under the curve of 0.93.
A trained convolutional neural network algorithm appears to successfully identify pneumatisation of the middle turbinate with high accuracy. Further studies can be pursued to test its ability in other clinically important anatomical variants in otolaryngology and rhinology.
The beginning of laminar–turbulent transition is usually associated with a wave-like disturbance, but its evolution and role in precipitating the development of other flow structures are not well understood from a structure-based view. Nonlinear parabolized stability equations (NPSE) were solved numerically to simulate K-regime, N-regime and O-regime transitions. However, only the K-regime transition was examined experimentally using both hydrogen bubble visualization and time-resolved tomographic particle image velocimetry (tomo-PIV). Based on the ‘NPSE visualization’ and ‘tomographic visualization’, at least four common characteristics of the generic transition process were identified: (i) inflectional regions representing high-shear layers (HSL) that develop in vertical velocity profiles, accompanied by ejection–sweep behaviours; (ii) low-speed streak (LSS) patterns, manifested in horizontal timelines, that seem to consist of several three-dimensional (3-D) waves; (iii) a warped wave front (WWF) pattern, displaying multiple folding processes, which develops adjacent to the LSS in the near-wall region, prior to the appearance of 𝛬-vortices; (iv) a coherent 3-D wave front, similar to a soliton, in the upper boundary layer, accompanied by regions of depression along the flanks of the wave. It was determined that the amplification and lift-up of a 3-D wave cause the development of the HSL, WWF and multiple folding behaviour of material surfaces, which all contribute to the development of a 𝛬-vortex. The amplified 3-D wave is hypothesized to be a soliton-like coherent structure. Based on our results, a path to transition is proposed, which hypothesizes the function of the WWF in boundary-layer transition.
Exposure to glucocorticoid levels higher than appropriate for the current developmental stage induces offspring metabolic dysfunction. Overfed/obese (OB) ewes and their fetuses display elevated blood cortisol, while fetal adrenocorticotropic hormone (ACTH) remains unchanged. We hypothesized that OB pregnancies would show increased placental 11β-hydroxysteroid dehydrogenase 2 (11β-HSD2), which converts maternal cortisol to fetal cortisone as it crosses the placenta, and increased 11β-HSD system components responsible for peripheral tissue cortisol production, providing a mechanism for an ACTH-independent increase in circulating fetal cortisol. Control ewes ate 100% of National Research Council recommendations (CON) and OB ewes ate 150% of the CON diet from 60 days before conception until necropsy at day 135 of gestation. At necropsy, maternal jugular and umbilical venous blood, fetal liver, perirenal fat, and cotyledonary tissues were harvested. Maternal plasma cortisol and fetal cortisol and cortisone were measured. Fetal liver, perirenal fat, and cotyledonary 11β-HSD1, hexose-6-phosphate dehydrogenase (H6PD), and 11β-HSD2 protein abundance were determined by Western blot. Maternal plasma cortisol, fetal plasma cortisol, and cortisone were higher in OB vs. CON (p < 0.01). 11β-HSD2 protein was greater (p < 0.05) in OB cotyledonary tissue than CON. 11β-HSD1 abundance increased (p < 0.05) in OB vs. CON fetal liver and perirenal fat. Fetal H6PD, an 11β-HSD1 cofactor, also increased (p < 0.05) in OB vs. CON perirenal fat and tended to be elevated in OB liver (p < 0.10). Our data provide evidence for increased 11β-HSD system components responsible for peripheral tissue cortisol production in fetal liver and adipose tissue, thereby providing a mechanism for an ACTH-independent increase in circulating fetal cortisol in OB fetuses.
Over half of individuals with eating disorders experience suicidal ideation at some point in their lives, yet few longitudinal studies have examined predictors of ideation in this at-risk group. Moreover, prospective research has focused on relatively distal or trait-level factors that are informative for distinguishing who is most at risk but not when. Little is known about more proximal or state-level risk factors that fluctuate within an individual, which is critical for determining when a person is most likely to engage in suicidal behaviors.
Women (N = 97) receiving treatment for their eating disorder completed questionnaires weekly to assess suicidal ideation and interpersonal constructs (i.e. perceived burdensomeness, thwarted belongingness) theorized to be proximal predictors of suicidal desire. Longitudinal multilevel models were conducted to examine both within- and between-person predictors of suicidal ideation across 12 weeks of treatment.
Statistically significant within-person effects for burdensomeness (β = 0.06; p < 0.001) indicate that when individuals have greater feelings of burdensomeness compared to their own average, they also experience higher suicidal ideation. We did not find any significant influence of thwarted belongingness or the interaction between burdensomeness and belongingness on suicidal ideation.
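The within- vs. between-person distinction above rests on person-mean centring: each weekly predictor score is split into the person's own mean (the between-person component) and the deviation from that mean (the within-person component). A minimal sketch with hypothetical weekly burdensomeness ratings:

```python
from statistics import mean

def within_between(scores_by_person):
    """Split repeated measures into a between-person component (each
    person's own mean) and within-person deviations from that mean.
    Scores below are hypothetical weekly ratings, not study data."""
    out = {}
    for person, scores in scores_by_person.items():
        m = mean(scores)
        out[person] = {"between": m, "within": [s - m for s in scores]}
    return out

weekly = {"A": [2, 4, 6], "B": [5, 5, 8]}
decomposed = within_between(weekly)
print(decomposed["A"])  # person A: mean 4, deviations [-2, 0, 2]
```

In the multilevel model, the within-person deviations carry the "greater than one's own average" effect reported above, while the person means capture stable between-person differences.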
This study was the first to examine dynamic associations between interpersonal constructs and suicidal ideation in individuals with eating disorders. Results are only partially consistent with the Interpersonal Theory of Suicide and suggest that short-term changes in burdensomeness may impact suicidal behavior in individuals with eating disorders.
The prevalence of many diseases in pigs displays seasonal distributions. Despite growing concerns about the impacts of climate change, we do not yet have a good understanding of the role that weather factors play in explaining such seasonal patterns. In this study, national and county-level aggregated abattoir inspection data were assessed for England and Wales during 2010–2015. Seasonally adjusted relationships were characterised between weekly ambient maximum temperature and the prevalence of both respiratory conditions and tail biting detected at slaughter. The prevalence of respiratory conditions showed cyclical annual patterns with peaks in the summer months and troughs in the winter months each year. However, there were no obvious associations with either high or low temperatures. The prevalence of tail biting generally increased as temperatures decreased, but associations were not supported by statistical evidence: across all counties there was a relative risk of 1.028 (95% CI 0.776–1.363) for every 1 °C fall in temperature. Whilst the seasonal patterns observed in this study are similar to those reported in previous studies, the lack of statistical evidence for an explicit association with ambient temperature may possibly be explained by the lack of information on date of disease onset. There is also the possibility that other time-varying factors not investigated here may be driving some of the seasonal patterns.
Recent work suggests that antihypertensive medications may be useful as repurposed treatments for mood disorders. Using large-scale linked healthcare data we investigated whether certain classes of antihypertensive, such as angiotensin antagonists (AAs) and calcium channel blockers, were associated with reduced risk of new-onset major depressive disorder (MDD) or bipolar disorder (BD).
Two cohorts of patients treated with antihypertensives were identified from Scottish prescribing (2009–2016) and hospital admission (1981–2016) records. Eligibility for cohort membership was determined by receipt of a minimum of four prescriptions for antihypertensives within a 12-month window. One treatment cohort (n = 538 730) included patients with no previous history of mood disorder, whereas the other (n = 262 278) included those who did. Both cohorts were matched by age, sex and area deprivation to untreated comparators. Associations between antihypertensive treatment and new-onset MDD or bipolar episodes were investigated using Cox regression.
For patients without a history of mood disorder, antihypertensives were associated with increased risk of new-onset MDD. For AA monotherapy, the hazard ratio (HR) for new-onset MDD was 1.17 (95% CI 1.04–1.31). Beta blockers' association was stronger (HR 2.68; 95% CI 2.45–2.92), possibly indicating pre-existing anxiety. Some classes of antihypertensive were associated with protection against BD, particularly AAs (HR 0.46; 95% CI 0.30–0.70). For patients with a past history of mood disorders, all classes of antihypertensives were associated with increased risk of future episodes of MDD.
There was no evidence that antihypertensive medications prevented new episodes of MDD but AAs may represent a novel treatment avenue for BD.
Rules of thumb (RoTs) are proposed as a means of promoting higher levels of Defined Contribution (DC) pension saving and to help stimulate debate about the high and uncertain cost of pension provision, leading to the development of solutions. The Lifetime Pension Contribution (LPC) tells young people what pension contribution is required over a full working life to achieve a decent retirement income, calculated as 23% of average UK earnings. Another RoT is that each 1% of earnings provides a pension of 1.5% of earnings. Other RoTs show how costs vary by retirement age and if the saver's retirement planning is on track. The current high cost of pensions is partly due to low interest rates and the inefficiencies of the DC market, with inadequate bulk purchasing power and risk sharing. RoTs might help encourage higher employer contributions, either through automatic enrolment or on a voluntary basis.
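Composing the two rules of thumb as stated (a Lifetime Pension Contribution of 23% of earnings, and each 1% of earnings contributed buying a pension of 1.5% of earnings) is simple arithmetic. The sketch below only illustrates the stated RoTs and is no substitute for actuarial modelling:

```python
def pension_from_contribution(contribution_pct):
    """Rule of thumb from the text: each 1% of earnings contributed
    over a full working life provides a pension of 1.5% of earnings."""
    return 1.5 * contribution_pct

# Under this rule, the 23% Lifetime Pension Contribution would deliver
# a pension of roughly 34.5% of earnings.
print(pension_from_contribution(23))  # 34.5
```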
Sulfur-bearing monazite-(Ce) occurs in silicified carbonatite at Eureka, Namibia, forming rims up to ~0.5 mm thick on earlier-formed monazite-(Ce) megacrysts. We present X-ray photoelectron spectroscopy data demonstrating that sulfur is accommodated predominantly in monazite-(Ce) as sulfate, via a clino-anhydrite-type coupled substitution mechanism. Minor sulfide and sulfite peaks in the X-ray photoelectron spectra, however, also indicate that more complex substitution mechanisms incorporating S2– and S4+ are possible. Incorporation of S6+ through clino-anhydrite-type substitution results in an excess of M2+ cations, which previous workers have suggested is accommodated by auxiliary substitution of OH– for O2–. However, Raman data show no indication of OH–, and instead we suggest charge imbalance is accommodated through F– substituting for O2–. The accommodation of S in the monazite-(Ce) results in considerable structural distortion that may account for relatively high contents of ions with radii beyond those normally found in monazite-(Ce), such as the heavy rare earth elements, Mo, Zr and V. In contrast to S-bearing monazite-(Ce) in other carbonatites, S-bearing monazite-(Ce) at Eureka formed via a dissolution–precipitation mechanism during prolonged weathering, with S derived from an aeolian source. While large S-bearing monazite-(Ce) grains are likely to be rare in the geological record, formation of secondary S-bearing monazite-(Ce) in these conditions may be a feasible mineral for dating palaeo-weathering horizons.
The Late Formative period immediately precedes the emergence of Tiwanaku, one of the earliest South American states, yet it is one of the most poorly understood periods in the southern Lake Titicaca Basin (Bolivia). In this article, we refine the ceramic chronology of this period with large sets of dates from eight sites, focusing on temporal inflection points in decorated ceramic styles. These points, estimated here by Bayesian models, index specific moments of change: (1) cal AD 120 (60–170, 95% probability): the first deposition of Kalasasaya red-rimmed and zonally incised styles; (2) cal AD 240 (190–340, 95% probability): a tentative estimate of the final deposition of Kalasasaya zonally incised vessels; (3) cal AD 420 (380–470, 95% probability): the final deposition of Kalasasaya red-rimmed vessels; and (4) cal AD 590 (500–660, 95% probability): the first deposition of Tiwanaku Redwares. These four modeled boundaries anchor an updated Late Formative chronology, which includes the Initial Late Formative phase, a newly identified decorative hiatus between the Middle and Late Formative periods. The models place Qeya and transitional vessels between inflection points 3 and 4 based on regionally consistent stratigraphic sequences. This more precise chronology will enable researchers to explore the trajectories of other contemporary shifts during this crucial period in Lake Titicaca Basin's prehistory.
This article attempts to identify the main ‘above-ground’ factors which impact on the contribution that geothermal energy can make to the Dutch Energy Transition, and to draw conclusions about these factors. Recent literature sources are used to illustrate the size of Dutch heating demand, and the part of this which can be provided by geothermal energy. Consideration is given to the impact of off-take variability over time, showing that the base-load nature of geothermal doublets acts as a restraint on the share which they can take in the energy supply. The characteristics of district heating grids are discussed. Other potential sources of heat are considered and compared.
The conclusion is that geothermal energy can provide a material contribution to the energy transition. This depends to a large extent on the existence of and design choices made for the development of district heating networks. Large size and standardisation, and the development of seasonal heat storage, are beneficial.
Unlike most other renewable sources of heat, which have alternative ‘premium’ applications such as the provision of ‘peak capacity’ or molecules for feedstock, geothermal energy is not suitable for other uses. The emission savings that it can provide will be lost if other heat sources are chosen in preference as supply for district heating, so that it makes sense that district heating infrastructure should be designed to encourage the use of geothermal energy where possible.
We have detected 27 new supernova remnants (SNRs) using a new data release of the GLEAM survey from the Murchison Widefield Array telescope, including the lowest surface brightness SNR ever detected, G 0.1 – 9.7. Our method uses spectral fitting to the radio continuum to derive spectral indices for 26/27 candidates, and our low-frequency observations probe a steeper spectrum population than previously discovered. None of the candidates have coincident WISE mid-IR emission, further showing that the emission is non-thermal. Using pulsar associations we derive physical properties for six candidate SNRs, finding G 0.1 – 9.7 may be younger than 10 kyr. Sixty per cent of the candidates subtend areas larger than 0.2 deg² on the sky, compared with fewer than 25 per cent of previously detected SNRs. We also make the first detection of two SNRs in the Galactic longitude range 220°–240°.
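A radio spectral index is the slope α of a power law S ∝ ν^α fitted to the continuum measurements in log–log space. The abstract does not detail the fitting procedure; the sketch below shows the standard least-squares approach on synthetic data:

```python
import math

def spectral_index(freqs_mhz, fluxes_jy):
    """Least-squares slope of log(S) vs. log(nu), i.e. the spectral
    index alpha in S ∝ nu**alpha. Data below are synthetic,
    for illustration only; not survey values."""
    xs = [math.log(f) for f in freqs_mhz]
    ys = [math.log(s) for s in fluxes_jy]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic SNR-like spectrum with alpha = -0.7 across the GLEAM band
freqs = [76, 107, 143, 189, 227]
fluxes = [10.0 * (f / 76) ** -0.7 for f in freqs]
print(round(spectral_index(freqs, fluxes), 3))  # -0.7
```

Non-thermal (synchrotron) SNR spectra typically have α around −0.5, which is why a steep fitted index, together with the absence of mid-IR emission, supports an SNR classification.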
This work makes available a further portion of the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey, covering half of the accessible galactic plane, across 20 frequency bands sampling 72–231 MHz. Unlike previous GLEAM data releases, we used multi-scale CLEAN to better deconvolve large-scale galactic structure. For the galactic longitude ranges $345^\circ < l < 67^\circ$ and $180^\circ < l < 240^\circ$, we provide a compact source catalogue of 22 037 components selected from a 60-MHz bandwidth image centred at 200 MHz, with position accuracy better than 2 arcsec. The catalogue has a completeness of 50% near the detection limit and a reliability of 99.86%. It covers a wider range of galactic latitudes towards the galactic centre than for other regions, and is available from VizieR; images for all longitudes are made available on the GLEAM Virtual Observatory (VO) server and SkyView.
We examined the latest data release from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey covering 345° < l < 67° and 180° < l < 240°, using these data and those of the Widefield Infrared Survey Explorer to follow up candidate supernova remnants (SNRs) proposed from other sources. Of the 101 candidates proposed in the region, we are able to definitively confirm ten as SNRs, tentatively confirm two as SNRs, and reclassify five as H ii regions. A further two are detectable in our images but difficult to classify; the remaining 82 are undetectable in these data. We also investigated the 18 unclassified Multi-Array Galactic Plane Imaging Survey (MAGPIS) candidate SNRs, newly confirming three as SNRs, reclassifying two as H ii regions, and exploring the unusual spectra and morphology of two others.
That almost half the Northern electorate continued to vote for Democrats is one of the least understood aspects of the Civil War experience. In too many accounts of the war, Northern Democrats either do not figure at all, or do so only as morally blind obstructionists on the wrong side of history. Yet there is a case for saying that rather than being peripheral to the narrative of the war, Northern Democrats should be center stage. Because the route to Confederate victory lay in convincing the North that the cost of coercion was too high to be worth paying, the views and actions of that large and fluctuating group of white Northerners who had never joined the Republican bandwagon were crucial.