This project will work closely with existing service partners involved in street-level services and will focus on testing and evaluating three approaches to street-level intervention for youth who are homeless and have moderate or severe mental illness. Youth will be asked to choose their preferred service approach:
Housing First initiatives, focused on interventions designed to move youth into appropriate, available housing with ongoing housing supports;
Treatment First initiatives, providing mental health/addiction supports and treatment solutions; and
Simultaneous attention to both housing and treatment together.
Our primary objective is to understand the service delivery preferences of homeless youth and the outcomes of these choices. Our research questions include:
1. Which approaches to service are chosen by youth?
2. What are the differences and similarities between groups choosing each approach?
3. What are the critical ingredients needed to effectively implement services for homeless youth from the perspectives of youth, families and service providers?
Focus groups with staff and family members will be held to help us understand the nature of each service approach, the changes that evolve within services, and the facilitators of and barriers to service delivery. This work will be important in determining which approach is chosen by youth and why. Evaluating the outcomes associated with each choice will provide valuable information about the service options youth select, helping to identify weaknesses in the services offered and to inform the further development of treatment options that youth will accept.
This audit was undertaken to evaluate the screening carried out for metabolic side effects of antipsychotic drugs in patients under the care of Ceredigion Community Mental Health.
The aims were to raise awareness with the Local Health Board regarding metabolic screening and to raise awareness of the four aspects of metabolic syndrome.
The audit tool was based on the POMH (Prescribing Observatory for Mental Health) Topic 2B audit. Data were collected from medical notes, abstracted and entered into Microsoft Excel. The sample consisted of 15 service users: 11 (73%) male and 4 (27%) female.
Most individuals (14, 93%) were of white British/Irish ethnicity.
For most patients (14, 93%), the primary clinical psychiatric diagnosis, based on ICD-10 categories, was F20–F29.
Six (40%) individuals in the sample smoked, of whom two (33%) were offered help with smoking cessation. There was evidence of diabetes in three case notes; in one (33%) of these it was Mental Health services that uncovered the diabetes. One service user's file contained evidence of a known diagnosis of hypertension, which had not been discovered by Mental Health services. Three service users had evidence of a disturbed lipid profile; in two (66%) of these notes it was Mental Health services that discovered the disturbed lipid profile.
Lifestyle management pack: guidance and resources for staff and service users around diet, exercise, smoking cessation and other health lifestyle issues. Physical health check reminder cards: A patient-held card to record the results of physical health checks and due dates for new appointments.
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band, the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
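As a rough illustration of what a ~22 K system temperature implies for sensitivity, the sketch below converts it to a system equivalent flux density (SEFD) for the 64-m Parkes dish. This is our own back-of-envelope arithmetic, not a figure from the paper, and the aperture efficiency is an assumed, illustrative value.

```python
# Back-of-envelope SEFD sketch (our arithmetic, not from the paper).
# Only T_sys comes from the abstract; eta is an assumed value.
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K
T_sys = 22.0             # system temperature from the abstract, K
diameter = 64.0          # Parkes dish diameter, m
eta = 0.6                # assumed aperture efficiency (illustrative)

A_eff = eta * math.pi * (diameter / 2.0) ** 2
sefd_jy = 2.0 * k_B * T_sys / A_eff / 1e-26  # 1 Jy = 1e-26 W m^-2 Hz^-1
print(f"SEFD ~ {sefd_jy:.0f} Jy")            # roughly 30 Jy
```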
Determining best practices for managing free farrowing systems is crucial for uptake. Cross-fostering, the exchange of piglets between litters, is routinely performed amongst crate-housed sows. However, cross-fostering can increase fighting amongst the litter and may be more challenging within free farrowing systems as sows have more freedom to respond to cross-fostered piglets. This study compared the effect of either cross-fostering (FOS), or a control of sham-fostering (CON), of four focal piglets per litter on Day 6 postpartum in crates (CRATE) and free farrowing pens (PEN). The post-treatment behavioural responses of sows were recorded (Day 6 = 60 min; Day 7 = 300 min; n = 48), as were the average daily gain (ADG; g/day), total weight gain (TWG; kg) and body lesion scores of focal piglets and their littermates throughout lactation (Day 6, Day 8, Day 11 and Day 26; n = 539) and the post-weaning period (Day 29, Day 32 and Day 60; n = 108). On Day 6, though post-reunion latency to nursing did not differ, latency to successful nursing was longer amongst FOS than CON litters (P < 0.001), more so amongst CRATE FOS than PEN FOS (P < 0.01). On Day 7, PEN FOS sows had fewer successful nursing bouts (P < 0.05) and exhibited decreased lateral (P < 0.01) and increased ventral lying frequencies (P < 0.01) compared to all other housing and treatment combinations. Focal piglet ADG was lower for FOS than CON in the CRATE during Day 6 to Day 8 (P < 0.01) and lower in the PEN during Day 6 to Day 8 (P < 0.001), Day 8 to Day 11 (P < 0.01) and Day 11 to Day 26 (P < 0.05). The TWG of pre-weaned focal piglets (Day 6 to Day 26) was higher amongst CON than FOS litters (P = 0.01). Post-weaning, piglet ADG was higher for PEN than CRATE during Day 26 to Day 29 (P < 0.01) and higher for FOS than CON during Day 26 to Day 29 (P < 0.05), Day 29 to Day 32 (P < 0.001) and Day 32 to Day 60 (P < 0.01); thus, TWG was higher for FOS than CON during the weaner (P = 0.001) and the combined lactation and weaner periods (P = 0.09). In conclusion, sow behaviour was disrupted by cross-fostering in the crates and pens and continued to be disturbed on the following day amongst penned sows. FOS piglets exhibited reduced ADG after cross-fostering, which extended throughout lactation in the pens. However, the increased post-weaning weight gain of FOS piglets meant that their TWG was higher than CON piglets, irrespective of the farrowing system used.
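For readers unfamiliar with the growth metrics above, the following sketch shows how ADG (g/day) and TWG (kg) are computed between weigh days; the example weights are illustrative, not study data.

```python
# Minimal sketch of the growth metrics used above: average daily gain
# (ADG, g/day) and total weight gain (TWG, kg) between weigh days.
def adg_g_per_day(w_start_kg: float, w_end_kg: float, days: int) -> float:
    """Average daily gain in g/day over a weighing interval."""
    return 1000.0 * (w_end_kg - w_start_kg) / days

def twg_kg(w_start_kg: float, w_end_kg: float) -> float:
    """Total weight gain in kg over a weighing interval."""
    return w_end_kg - w_start_kg

# e.g. the Day 6 to Day 8 interval (2 days) for a hypothetical piglet:
print(adg_g_per_day(2.4, 2.9, 2))  # ~250 g/day
print(twg_kg(2.4, 2.9))            # ~0.5 kg
```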
Surgical site infections (SSIs) are among the most common healthcare-associated infections in low- and middle-income countries. To encourage the establishment of actionable and standardized SSI surveillance in these countries, we propose simplified surveillance case definitions. Here, we use NHSN reports to explore the concordance of these simplified definitions with the NHSN definitions as the ‘reference standard.’
The prevalence of many diseases in pigs displays seasonal distributions. Despite growing concerns about the impacts of climate change, we do not yet have a good understanding of the role that weather factors play in explaining such seasonal patterns. In this study, national and county-level aggregated abattoir inspection data were assessed for England and Wales during 2010–2015. Seasonally-adjusted relationships were characterised between weekly ambient maximum temperature and the prevalence of both respiratory conditions and tail biting detected at slaughter. The prevalence of respiratory conditions showed cyclical annual patterns with peaks in the summer months and troughs in the winter months each year. However, there were no obvious associations with either high or low temperatures. The prevalence of tail biting generally increased as temperatures decreased, but associations were not supported by statistical evidence: across all counties there was a relative risk of 1.028 (95% CI 0.776–1.363) for every 1 °C fall in temperature. Whilst the seasonal patterns observed in this study are similar to those reported in previous studies, the lack of statistical evidence for an explicit association with ambient temperature may possibly be explained by the lack of information on date of disease onset. There is also the possibility that other time-varying factors not investigated here may be driving some of the seasonal patterns.
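A seasonally adjusted model of the kind described above could be sketched as follows. This is a minimal illustration, not the authors' code; the input file and column names (week, n_inspected, n_affected, tmax) are hypothetical.

```python
# Minimal sketch of a seasonally adjusted model relating weekly
# condition counts at slaughter to ambient maximum temperature.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("abattoir_weekly.csv")  # hypothetical weekly extract

# Annual harmonic terms absorb the seasonal cycle, so the temperature
# coefficient is estimated net of season.
df["sin1"] = np.sin(2 * np.pi * df["week"] / 52.0)
df["cos1"] = np.cos(2 * np.pi * df["week"] / 52.0)

model = smf.glm(
    "n_affected ~ tmax + sin1 + cos1",
    data=df,
    offset=np.log(df["n_inspected"]),  # denominator for the prevalence
    family=sm.families.Poisson(),
).fit()

# exp(-beta_tmax) approximates the relative risk per 1 degC *fall* in
# temperature, the quantity reported above (1.028, 95% CI 0.776-1.363).
print(np.exp(-model.params["tmax"]))
```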
Over the past three decades, only about 20 new drugs have been developed for malaria, tuberculosis and all the neglected tropical diseases (NTDs). This critical situation has arisen because NTDs attract only 10% of health research investment, despite accounting for about 90% of the global disease burden. Computational simulations applied in virtual screening (VS) strategies are very efficient tools for identifying pharmacologically active compounds, or new indications for drugs already administered for other diseases. One advantage of this approach is a fast, low-budget first stage that filters out a group of candidate compounds with a high chance of binding the target and showing trypanocidal activity, for subsequent experimental testing. In this work we review the most common VS strategies that have been used to identify new drugs, with special emphasis on those applied to trypanosomiasis and leishmaniasis. Computational simulations based on selected protein targets or their ligands are explained, including the method selection criteria, examples of successful VS campaigns applied to NTDs, a list of validated molecular targets for drug development, and repositioned drugs for trypanosomatid-caused diseases. We thereby present the state of the art of VS and drug repurposing, and conclude by pointing out future perspectives in the field.
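As a concrete example of the cheap first-stage filtering described above, here is a minimal ligand-based VS sketch using RDKit. The reference SMILES, library file and similarity cutoff are hypothetical placeholders, not compounds or settings from the review.

```python
# Minimal ligand-based virtual-screening sketch (one of the VS families
# the review covers): rank a library by fingerprint similarity to a
# known active, keeping only high-similarity candidates.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

reference = Chem.MolFromSmiles("CC1=CC=C(C=C1)S(=O)(=O)N")  # placeholder
ref_fp = AllChem.GetMorganFingerprintAsBitVect(reference, 2, nBits=2048)

hits = []
for mol in Chem.SmilesMolSupplier("library.smi", titleLine=False):
    if mol is None:          # skip unparsable library entries
        continue
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    sim = DataStructs.TanimotoSimilarity(ref_fp, fp)
    if sim >= 0.7:           # arbitrary cutoff for the cheap first stage
        hits.append((Chem.MolToSmiles(mol), sim))

# Only this short list would go forward to experimental assays for
# trypanocidal activity.
print(sorted(hits, key=lambda h: -h[1])[:10])
```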
Prevention of Clostridioides difficile infection (CDI) is a national priority and may be facilitated by deployment of the Targeted Assessment for Prevention (TAP) Strategy, a quality improvement framework providing a focused approach to infection prevention. This article describes the process and outcomes of TAP Strategy implementation for CDI prevention in a healthcare system.
Hospital A was identified based on CDI surveillance data indicating an excess burden of infections above the national goal; hospitals B and C participated as part of systemwide deployment. TAP facility assessments were administered to staff to identify infection control gaps and inform CDI prevention interventions. Retrospective analysis was performed using negative-binomial, interrupted time series (ITS) regression to assess overall effect of targeted CDI prevention efforts. Analysis included hospital-onset, laboratory-identified C. difficile event data for 18 months before and after implementation of the TAP facility assessments.
The systemwide monthly CDI rate significantly decreased at the intervention (β2, −44%; P = .017), and the postintervention CDI rate trend showed a sustained decrease (β1 + β3; −12% per month; P = .008). At an individual hospital level, the CDI rate trend significantly decreased in the postintervention period at hospital A only (β1 + β3, −26% per month; P = .003).
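The segmented-regression parametrisation reported above (β2 for the level change at the intervention, β1 + β3 for the post-intervention trend) can be sketched as follows. This is a minimal illustration with assumed column names, not the study's analysis code.

```python
# Minimal negative-binomial interrupted time series sketch:
# beta1 = baseline slope, beta2 = level change at the intervention,
# beta3 = slope change, so beta1 + beta3 is the post-intervention trend.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("cdi_monthly.csv")  # hypothetical: one row per month

df["post"] = (df["month"] > 18).astype(int)          # 18 months pre/post
df["months_post"] = np.maximum(df["month"] - 18, 0)  # time since change

model = smf.glm(
    "cdi_events ~ month + post + months_post",
    data=df,
    offset=np.log(df["patient_days"]),  # exposure term for the rate
    family=sm.families.NegativeBinomial(),
).fit()

b = model.params
print("level change:", np.exp(b["post"]) - 1)                       # cf. -44%
print("post trend/month:", np.exp(b["month"] + b["months_post"]) - 1)  # cf. -12%
```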
This project demonstrates TAP Strategy implementation in a healthcare system, yielding significant decrease in the laboratory-identified C. difficile rate trend in the postintervention period at the system level and in hospital A. This project highlights the potential benefit of directing prevention efforts to facilities with the highest burden of excess infections to more efficiently reduce CDI rates.
This study investigated whether individual differences in receptive vocabulary, speech perception and production, and nonword repetition at age 2 years, 4 months to 3 years, 4 months predicted phonological awareness 2 years later. One hundred twenty-one children were tested twice. During the first testing period (Time 1), children’s receptive vocabulary, speech perception and production, and nonword repetition were measured. Nonword repetition accuracy in the present study was distinct from other widely used measures of nonword repetition in that it focused on narrow transcription of diphone sequences in each nonword that differed systematically in phonotactic probability. At the second testing period (Time 2), children’s phonological awareness was measured. The best predictors of phonological awareness were a measure of speech production and a measure of phonological processing derived from performance on the nonword repetition task. The results of this study suggest that nonword repetition accuracy provides an implicit measure of phonological skills that are indicative of later phonological awareness at an age when children are too young to perform explicit phonological awareness tasks reliably.
To describe pathogen distribution and rates for central-line–associated bloodstream infections (CLABSIs) from different acute-care locations during 2011–2017 to inform prevention efforts.
CLABSI data from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) were analyzed. Percentages and pooled mean incidence density rates were calculated for a variety of pathogens and stratified by acute-care location groups (adult intensive care units [ICUs], pediatric ICUs [PICUs], adult wards, pediatric wards, and oncology wards).
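A pooled mean incidence density rate of this kind is simply total events divided by total central-line days, scaled per 1,000 line-days. The sketch below illustrates the stratified calculation; the column names are hypothetical, not an NHSN export format.

```python
# Minimal sketch of the pooled mean incidence density calculation,
# stratified by location group and pathogen.
import pandas as pd

df = pd.read_csv("clabsi_events.csv")  # hypothetical analytic file

pooled = df.groupby(["location_group", "pathogen"]).agg(
    events=("clabsi_count", "sum"),
    line_days=("central_line_days", "sum"),
)
pooled["rate_per_1000_line_days"] = 1000.0 * pooled["events"] / pooled["line_days"]
print(pooled.sort_values("rate_per_1000_line_days", ascending=False).head())
```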
From 2011 to 2017, 136,264 CLABSIs were reported to the NHSN by adult and pediatric acute-care locations; adult ICUs and wards reported the most CLABSIs: 59,461 (44%) and 40,763 (30%), respectively. In 2017, the most common pathogens were Candida spp/yeast in adult ICUs (27%) and Enterobacteriaceae in adult wards, pediatric wards, oncology wards, and PICUs (23%–31%). Most pathogen-specific CLABSI rates decreased over time, excepting Candida spp/yeast in adult ICUs and Enterobacteriaceae in oncology wards, which increased, and Staphylococcus aureus rates in pediatric locations, which did not change.
The pathogens associated with CLABSIs differ across acute-care location groups. Pathogen-targeted prevention efforts that augment current strategies, such as those aimed at preventing Candida spp/yeast and Enterobacteriaceae CLABSIs, might further reduce national rates.
The duodenum lies in front of the right kidney and renal vessels, the right psoas muscle, the inferior vena cava, and the aorta (Figure 26.1).
The duodenum is approximately 25 cm in length. It is the most fixed part of the small intestine and has no mesentery. It is anatomically divided into four parts:
The superior or first portion is intraperitoneal along the anterior half of its circumference. Superiorly, the first portion is attached to the hepatoduodenal ligament. The posterior wall is associated with the gastroduodenal artery, common bile duct, and the portal vein.
The descending or second portion is retroperitoneal and shares a medial border with the head of the pancreas. It is bordered posteriorly by the medial surface of the right kidney, the right renal vessels, and the inferior vena cava. The hepatic flexure and transverse colon cross anteriorly. The common bile duct and main pancreatic duct drain into the medial wall of the descending duodenum.
The transverse or third portion is also entirely retroperitoneal. Posteriorly, it is bordered by the inferior vena cava and the aorta. The superior mesenteric vessels cross in front of this portion of the duodenum.
The ascending or fourth portion of the duodenum is approximately 2.5 cm in length and is primarily retroperitoneal, except for the most distal segment. It crosses anterior to and ascends to the left of the aorta to join the jejunum at the ligament of Treitz.
The common bile duct courses laterally within the hepatoduodenal ligament and lies posterior to the first portion of the duodenum and pancreatic head, becoming partially invested within the parenchyma of the pancreatic head. The main pancreatic duct then joins the common bile duct to drain into the ampulla of Vater within the second portion of the duodenum. The ampulla of Vater is located approximately 7 cm from the pylorus. The accessory pancreatic duct drains approximately 2 cm proximal to the ampulla of Vater.
The vascular supply to the duodenum is intimately associated with the head of the pancreas. The head of the pancreas and the second portion of the duodenum derive their blood supply from the anterior and posterior pancreaticoduodenal arcades (Figure 26.2). These arcades lie on the surface of the pancreas near the duodenal C loop. Attempts to separate these two organs at this location usually result in ischemia of the duodenum.
Considerable progress in explaining cultural evolutionary dynamics has been made by applying rigorous models from the natural sciences to historical and ethnographic information collected and accessed using novel digital platforms. Initial results have clarified several long-standing debates in cultural evolutionary studies, such as population origins, the role of religion in the evolution of complex societies and the factors that shape global patterns of language diversity. However, future progress requires recognition of the unique challenges posed by cultural data. To address these challenges, standards for data collection, organisation and analysis must be improved and widely adopted. Here, we describe some major challenges to progress in the construction of large comparative databases of cultural history, including recognising the critical role of theory, selecting appropriate units of analysis, data gathering and sampling strategies, winning expert buy-in, achieving reliability and reproducibility in coding, and ensuring interoperability and sustainability of the resulting databases. We conclude by proposing a set of practical guidelines to meet these challenges.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
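The adjusted logistic regressions described above might be sketched as follows; this is a minimal illustration with hypothetical variable names, not WIHS analysis code.

```python
# Minimal sketch of the two adjusted logistic regressions: FS is
# modelled as a category so each level gets its own odds ratio
# relative to high food security.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wihs_subset.csv")  # hypothetical analytic file

covariates = "age + C(race) + income + education + alcohol_use + substance_use"

# Base model: food security plus sociodemographic covariates.
base = smf.logit(
    f"any_psych_med ~ C(fs_level, Treatment(reference='high')) + {covariates}",
    data=df,
).fit()

# Second model additionally adjusts for depression (CESD) and anxiety
# (GAD-7) symptom scores, as in the analysis above.
adjusted = smf.logit(
    f"any_psych_med ~ C(fs_level, Treatment(reference='high')) + {covariates}"
    " + cesd + gad7",
    data=df,
).fit()

print(np.exp(adjusted.params))  # adjusted odds ratios
```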
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
Sulfur-bearing monazite-(Ce) occurs in silicified carbonatite at Eureka, Namibia, forming rims up to ~0.5 mm thick on earlier-formed monazite-(Ce) megacrysts. We present X-ray photoelectron spectroscopy data demonstrating that sulfur is accommodated predominantly in monazite-(Ce) as sulfate, via a clino-anhydrite-type coupled substitution mechanism. Minor sulfide and sulfite peaks in the X-ray photoelectron spectra, however, also indicate that more complex substitution mechanisms incorporating S2– and S4+ are possible. Incorporation of S6+ through clino-anhydrite-type substitution results in an excess of M2+ cations, which previous workers have suggested is accommodated by auxiliary substitution of OH– for O2–. However, Raman data show no indication of OH–, and instead we suggest charge imbalance is accommodated through F– substituting for O2–. The accommodation of S in the monazite-(Ce) results in considerable structural distortion that may account for relatively high contents of ions with radii beyond those normally found in monazite-(Ce), such as the heavy rare earth elements, Mo, Zr and V. In contrast to S-bearing monazite-(Ce) in other carbonatites, S-bearing monazite-(Ce) at Eureka formed via a dissolution–precipitation mechanism during prolonged weathering, with S derived from an aeolian source. While large S-bearing monazite-(Ce) grains are likely to be rare in the geological record, secondary S-bearing monazite-(Ce) formed under such conditions may be a feasible mineral for dating palaeo-weathering horizons.
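As a sketch of the charge balance at work here (our arithmetic and notation, not the paper's): the clino-anhydrite-type exchange is itself charge-neutral, so any M2+ incorporated in excess of a 1:1 pairing with S6+ requires an auxiliary anion substitution, which the authors assign to F– rather than OH–:

```latex
% Clino-anhydrite-type coupled substitution (charge-neutral, +8 = +8):
\mathrm{REE}^{3+} + \mathrm{P}^{5+} \;\rightleftharpoons\; \mathrm{M}^{2+} + \mathrm{S}^{6+}
% Auxiliary substitution balancing excess M^{2+} (+1 = +1), with
% F^- replacing O^{2-} as proposed above:
\mathrm{REE}^{3+} + \mathrm{O}^{2-} \;\rightleftharpoons\; \mathrm{M}^{2+} + \mathrm{F}^{-}
```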
This paper summarizes a multi-state, multi-year study assessing the potential for local agriculture in northern New England. While largely rural, this region's agricultural sector differs greatly from the rest of the United States, and demand for locally produced food has been increasing. To assess this unique economic landscape, researchers and Cooperative Extension at the Universities of Maine, New Hampshire, and Vermont investigated four key areas: (1) local food capacities, (2) constraints to agricultural expansion, (3) consumer preferences for local and organic produce, and (4) the role of intermediaries as alternative local food outlets. The project included input from local farmers, Extension members, restaurants, and the general public. We present the four research areas in a sequential, overlapping fashion. The timing of our research was such that each step in the process informed the next and can be used as a template for assessing a region's potential for local agricultural production.
The introduction of agriculture is a key defining element of the Neolithic, yet considerable debate persists concerning the nature and significance of early farming practices in north-west Europe. This paper reviews archaeobotanical evidence from 95 Neolithic sites (c. 4000–2200 cal bc) in Wales, focusing on wild plant exploitation, the range of crops present, and the significance of cereals in subsistence practices. Cereal cultivation practices in Early Neolithic Wales are also examined using cereal grain stable carbon (δ13C) and nitrogen (δ15N) isotope analysis. The Early Neolithic period witnessed the widespread uptake of cereals alongside considerable evidence for continued wild plant exploitation, notably hazelnuts and wild fruits. The possibility that wild plants and woodlands were deliberately managed or altered to promote the growth of certain plants is outlined. Small cereal grain assemblages, with little evidence for chaff and weed seeds, are common in the Early Neolithic, whereas cereal-rich sites are rare. Emmer wheat was the dominant crop in the Early Neolithic, while other cereal types were recorded in small quantities. Cereal nitrogen isotope (δ15N) values from Early Neolithic sites provided little evidence for intensive manuring. We suggest that cultivation conditions may have been less intensive when compared to other areas of Britain and Europe. In the later Neolithic period, there is evidence for a decline in the importance of cereals. Finally, the archaeobotanical and crop isotope data from this study are considered within a wider European context.
Thermal infrared data collected by the Thermal Emission Spectrometer (TES) and Thermal Emission Imaging System (THEMIS) instruments have significantly impacted the understanding of martian surface mineralogy. Spatial/temporal variations in igneous lithologies; the discovery of quartz, carbonates, and chlorides; and the widespread identification of amorphous, silica-enriched materials reveal a planet that has experienced a diversity of primary and secondary geologic processes including igneous crustal evolution, regional sedimentation, aqueous alteration, and glacial/periglacial activity.