We examine the environmental impacts of a cash transfer program in rural Zambia and investigate whether variation in market access is associated with heterogeneous impacts on natural resource use. We consider households’ use of firewood, charcoal, bushmeat and land for farming, as well as their ownership of non-farm businesses. We find that cash transfers increase the likelihood of charcoal consumption as well as the amount consumed for those living close to paved roads. The transfers also enable households to increase the size of their farms and establish non-farm businesses. These impacts are most pronounced for those living far from paved roads. While remoteness is associated with farm expansion in response to the cash transfer, more education causes those receiving the transfer to decrease the size of their farms. This impact heterogeneity has important implications for sustainable development.
Cognitive deficits in depressed adults may reflect impaired decision-making. To investigate this possibility, we analyzed data from unmedicated adults with Major Depressive Disorder (MDD) and healthy controls as they performed a probabilistic reward task. The Hierarchical Drift Diffusion Model (HDDM) was used to quantify decision-making mechanisms recruited by the task, to determine whether any such mechanism was disrupted by depression.
Data came from two samples (Study 1: 258 MDD, 36 controls; Study 2: 23 MDD, 25 controls). On each trial, participants indicated which of two similar stimuli was presented; correct identifications were rewarded. Quantile-probability plots and the HDDM quantified the impact of MDD on response times (RT), speed of evidence accumulation (drift rate), and the width of decision thresholds, among other parameters.
RTs were more positively skewed in depressed v. healthy adults, and the HDDM revealed that drift rates were reduced—and decision thresholds were wider—in the MDD groups. This pattern suggests that depressed adults accumulated the evidence needed to make decisions more slowly than controls did.
Depressed adults responded more slowly than controls in both studies, and poorer performance led the MDD group to receive fewer rewards than controls in Study 1. These results did not reflect a sensorimotor deficit but were instead due to sluggish evidence accumulation. Thus, slowed decision-making—not slowed perception or response execution—caused the performance deficit in MDD. If these results generalize to other tasks, they may help explain the broad cognitive deficits seen in depression.
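To make the mechanism concrete, the following is a minimal simulation sketch of the drift-diffusion process the HDDM estimates (not the authors' code, and the parameter values are purely hypothetical): evidence accumulates noisily at a drift rate v until it crosses one of two decision thresholds at ±a, and lowering v while widening a—the pattern reported for MDD—produces slower, more right-skewed response times.

```python
import numpy as np

def simulate_ddm(v, a, t0=0.3, dt=0.001, noise=1.0, n_trials=2000, seed=0):
    """Two-boundary drift-diffusion model: returns response times and choices."""
    rng = np.random.default_rng(seed)
    rts, choices = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < a:  # accumulate noisy evidence until a threshold is hit
            x += v * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t + t0)  # add non-decision (sensorimotor) time t0
        choices.append(1 if x > 0 else 0)
    return np.array(rts), np.array(choices)

# Hypothetical parameter values chosen only to illustrate the direction of the effect:
rt_ctl, _ = simulate_ddm(v=1.5, a=1.0)  # control-like: fast drift, narrow threshold
rt_mdd, _ = simulate_ddm(v=0.8, a=1.4)  # MDD-like: slower drift, wider threshold
print(f"median RT  control-like: {np.median(rt_ctl):.2f}s  MDD-like: {np.median(rt_mdd):.2f}s")
```

Under these assumed settings the MDD-like parameters markedly lengthen the median response time and fatten the slow tail of the RT distribution, mirroring the skewed RTs described above.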
There are no recent studies identifying the prevalence of parasites of human and veterinary importance in dogs and cats in Ireland. The interaction between pets and wildlife species in the environment is an important source of parasite exposure for canids and felines, and one likely to be heightened in the stray animal population. This study aimed to establish the prevalence of endoparasites in unowned dogs and cats in County Dublin, Ireland. Feces from stray dogs (n = 627) and cats (n = 289) entering a rehoming centre were collected immediately after defecation. The main parasitic agents detected were ascarids (15.52% and 30.26%), Cystoisospora (3.27% and 3.69%), Giardia spp. (6.02% and 1.84%) and lungworms (0.64% and 2.08%), in dogs and cats, respectively. Animals younger than 3 months of age were more likely to be infected with ascarids (P < 0.001) and Cystoisospora spp. (P = 0.008 and P = 0.014) than older animals. All lungworms were morphologically identified: dogs were infected with Angiostrongylus vasorum (0.48%) and Crenosoma vulpis (0.16%), whereas cats were only infected with Aelurostrongylus abstrusus (2.08%). This represents the first such prevalence study of stray animals in Ireland. The data collected will inform treatment and, in addition, future monitoring and control of parasite populations.
Understanding how critical sow live-weight and back-fat depth during gestation are to optimum sow productivity is important. The objective of this study was to quantify the association between sow parity, live-weight and back-fat depth during gestation and subsequent sow reproductive performance. Records of 1058 sows and 13 827 piglets from 10 trials on two research farms between the years 2005 and 2015 were analysed. Sows ranged from parity 1 to 6, with the number of sows per parity distributed as follows: 232, 277, 180, 131, 132 and 106, respectively. Variables that were analysed included total born (TB), born alive (BA), piglet birth weight (BtWT), pre-weaning mortality (PWM), piglet wean weight (WnWT), number of piglets weaned (Wn), wean to service interval (WSI), piglets born alive in the subsequent farrowing and sow lactation feed intake. Calculated variables included the within-litter CV in birth weight (LtV), pre-weaning growth rate per litter (PWG), total litter gain (TLG), lactation efficiency and litter size reared after cross-fostering. Data were analysed using linear mixed models accounting for covariance among records. Third and fourth parity sows had more (P<0.05) TB, BA and heavier BtWT compared with gilts and parity 6 contemporaries. Parity 2 and 3 sows weaned more (P<0.05) piglets than older sows, and their piglets had heavier (P<0.05) birth weights than those from gilt litters. LtV and PWM were greater (P<0.01) in litters born to parity 5 sows than in those born to younger sows. Sow live-weight and back-fat depth at service and at days 25 and 50 of gestation were not associated with TB, BA, BtWT, LtV, PWG, WnWT or lactation efficiency (P>0.05). Heavier sow live-weight throughout gestation was associated with increased PWM (P<0.01) and reduced Wn and lactation feed intake (P<0.05). Deeper back-fat in late gestation was associated with fewer (P<0.05) BA but heavier (P<0.05) BtWT, whereas deeper back-fat throughout gestation was associated with reduced (P<0.01) lactation feed intake. Sow back-fat depth was not associated with LtV, PWG, TLG, WSI or piglets born alive in the subsequent farrowing (P>0.05). In conclusion, this study showed that sow parity, live-weight and back-fat depth can be used as indicators of reproductive performance. It also provides validation for the future development of a benchmarking tool to monitor and improve the productivity of modern sow herds.
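As a minimal sketch of the kind of analysis described above (the file and column names are hypothetical, not the study's dataset), a linear mixed model with a random intercept per sow can account for covariance among repeated records from the same animal:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per litter, repeated rows per sow.
df = pd.read_csv("sow_records.csv")

# Fixed effects: parity (categorical), day-50 live-weight and back-fat depth.
# Random effect: an intercept for each sow, capturing within-sow covariance.
model = smf.mixedlm(
    "born_alive ~ C(parity) + liveweight_d50 + backfat_d50",
    data=df,
    groups="sow_id",
)
result = model.fit()
print(result.summary())
```

The random intercept plays the role of the covariance structure mentioned in the abstract; richer structures (e.g., random slopes over parity) would be specified via the `re_formula` argument.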
Globally, an estimated 46 million people are currently living with dementia and this figure is projected to increase 3-fold by 2050, highlighting this major public health concern and its substantial associated healthcare costs. With pharmacological treatment yet to reach fruition, the emphasis on evidence-based preventative lifestyle strategies is becoming increasingly important, and several modifiable lifestyle factors have been identified that may preserve cognitive health. These include good cardiovascular health, physical activity, low alcohol intake, not smoking and a healthy diet, with growing interest in vitamin D. The aim of the present paper is to review the evidence supporting the potential roles of vitamin D in ageing and cognitive health in community-dwelling older adults and, furthermore, to describe the utility and challenges of cognitive assessments and outcomes when investigating vitamin D in this context. Evidence indicates that serum 25-hydroxyvitamin D (25(OH)D) may impact brain health. There is biological plausibility from animal models that vitamin D may influence neurodegenerative disorders through several mechanisms. Epidemiological evidence supports associations between low serum 25(OH)D concentrations and poorer cognitive performance in community-dwelling older populations, although an optimal 25(OH)D level for cognitive health could not be determined. The effect of raising 25(OH)D concentrations on cognitive function remains unclear, as there is a paucity of interventional evidence. At a minimum, it seems prudent to aim to prevent vitamin D deficiency in older adults, alongside other known protective lifestyle factors, as a viable component of brain health strategies.
To describe the behavioural and psychiatric problems found in nursing home psychiatric referrals in the Dublin South city area.
We undertook two consecutive surveys of nursing home referrals to the St James’s Hospital psychiatry of old age service over a 2-year period. During the second survey a new clinical nurse specialist was specifically appointed to manage the seven nursing homes included in the study.
The most common reason for referral during survey one was uncooperative/aggressive behaviour (22%). For survey two, patients were most commonly referred for low mood (31%) or agitation (29%). During survey one, the majority of patients assessed were diagnosed with behavioural and psychological symptoms of dementia (41%). This was also a prevalent diagnosis during survey two, affecting 27% of those referred. Only 7% of patients were considered to be delirious during survey one. This rose to 31% the following year, making delirium the most common diagnosis during survey two. Over the 2-year study period, 7% of referred patients were diagnosed with depression. In terms of prescribing practices, the discontinuation rate of antipsychotic medication following psychiatric input was 13% in survey one. By survey two, this had risen to 47%.
Delirium is often undetected and untreated in nursing homes. Residents presenting with psychiatric symptoms should undergo routine bloods and urinalysis prior to psychiatric referral. Dedicated input from trained psychiatric nursing staff can lead to both an improvement in the recognition of delirium and reduced prescribing rates of antipsychotic medication.
The diagnosis of dementia remains inadequate, even within clinical settings. Data on rates and degree of impairment among inpatients are vital for service planning and the provision of appropriate patient care as Ireland's population ages.
Every patient aged 65 years and over admitted over a two-week period was invited to participate. Those who met inclusion criteria were screened for delirium and then underwent cognitive screening. Demographic, functional, and outcome data were obtained from medical records, participants, and family.
Consent to participate was obtained from 68.6% of the eligible population, and data for 143 patients were obtained. The mean age was 78.1 years; 27.3% met criteria for dementia and 21% had mild cognitive impairment (MCI). Only 41% of those with dementia and 10% of those with MCI had a previously documented impairment. Between-group analysis showed differences in length of stay (p = 0.003), number of readmissions in 12 months (p = 0.036), and likelihood of returning home (p = 0.039) between the dementia and normal groups. MCI outcomes were similar to those of the normal group. No difference was seen in one-year mortality. Effects were less pronounced on multivariate analysis but remained significant for length of stay even after controlling for demographics, personal and family history, and anxiety and depression screening scores. Patients with dementia remained in hospital 15.3 days longer (p = 0.047). A dementia diagnosis was the single biggest contributing factor to length of stay in our regression model.
Cognitive impairment is pervasive and under-recognized in the acute hospital and impacts negatively on patient outcomes.
Research shows that cognitive rehabilitation (CR) has the potential to improve goal performance and enhance well-being for people with early stage Alzheimer’s disease (AD). This single subject, multiple baseline design (MBD) research investigated the clinical efficacy of an 8-week individualised CR intervention for individuals with early stage AD.
Three participants with early stage AD were recruited to take part in the study. The intervention consisted of eight sessions of 60–90 minutes of CR. Outcomes included goal performance and satisfaction, quality of life, cognitive and everyday functioning, mood, and memory self-efficacy for participants with AD; and carer burden, general mental health, quality of life, and mood of carers.
Visual analysis of MBD data demonstrated a functional relationship between CR and improvements in participants’ goal performance. Subjective ratings of goal performance and satisfaction increased from baseline to post-test for three participants and were maintained at follow-up for two. Baseline to post-test quality of life scores improved for three participants, whereas cognitive function and memory self-efficacy scores improved for two.
Our findings demonstrate that CR can improve goal performance and is a socially acceptable intervention that can be implemented by practitioners with assistance from carers between sessions. This study represents a promising first step towards filling a practice gap in this area. Further research and randomised controlled trials are required.
The influence of party connection on the selection of judges has long been an issue in Canada. This article considers whether such connections adversely affect the appointment of women judges to federally appointed courts. The answer appears to be yes. Using political donations as a proxy for party connection, the data analyzed here suggest that as the number of appointees with connections to the government rises, the number of women appointees falls. However, for appointments to provincial courts by the government of Ontario, the prevalence of political connections among judicial appointees is less prominent, suggesting that different systems of judicial appointment may help to lessen these effects.
To assess the burden of bloodstream infections (BSIs) among pediatric hematology-oncology (PHO) inpatients, to propose a comprehensive, all-BSI tracking approach, and to discuss how such an approach helps better inform within-center and across-center differences in central-line–associated bloodstream infection (CLABSI) rates.
Prospective cohort study
US multicenter, quality-improvement, BSI prevention network
PHO centers across the United States that agreed to follow a standardized central-line–maintenance care bundle and to track all BSI events and central-line days every month.
Infections were categorized as CLABSI (stratified by mucosal barrier injury–related, laboratory-confirmed BSI [MBI-LCBI] versus non–MBI-LCBI) or secondary BSI, using National Healthcare Safety Network (NHSN) definitions. Single positive blood cultures (SPBCs) with NHSN-defined common commensals were also tracked.
Between 2013 and 2015, 34 PHO centers reported 1,110 BSIs. Among them, 708 (63.8%) were CLABSIs, 170 (15.3%) were secondary BSIs, and 232 (20.9%) were SPBCs. Most SPBCs (75%) occurred in patients with profound neutropenia; 22% of SPBCs were viridans group streptococci. Among the CLABSIs, 51% were MBI-LCBI. Excluding SPBCs, CLABSI rates were higher (88% vs 77%) and secondary BSI rates were lower (12% vs 23%) after the NHSN updated the definition of secondary BSI (P<.001). Preliminary analyses showed across-center differences in CLABSI versus secondary BSI and between SPBC and CLABSI versus non-CLABSI rates.
Tracking all BSIs, not just CLABSIs in PHO patients, is a patient-centered, clinically relevant approach that could help better assess across-center and within-center differences in infection rates, including CLABSI. This approach enables informed decision making by healthcare providers, payors, and the public.
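As a minimal sketch of the rate arithmetic behind such tracking (the category labels and figures are hypothetical, not the network's data), infections are expressed per 1,000 central-line days within each category, so that CLABSI, secondary BSI, and SPBC rates can be compared across centers:

```python
from collections import Counter

def bsi_rates(events, central_line_days):
    """events: list of BSI category labels; returns infections per 1,000 line days."""
    counts = Counter(events)
    return {cat: 1000 * n / central_line_days for cat, n in counts.items()}

# Hypothetical month at one center: four BSIs over 2,500 central-line days.
print(bsi_rates(["CLABSI", "CLABSI", "secondary BSI", "SPBC"], 2500))
# -> {'CLABSI': 0.8, 'secondary BSI': 0.4, 'SPBC': 0.4}
```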
The logic layer is the central nervous system of cyberspace. It is responsible for routing data packets to their final destinations, primarily via domain name systems (DNS), Internet protocols, browsers, Web sites, and software, all of which rely on the aforementioned fiber optic cables and physical foundations. Targeted cyber attacks can manipulate the logic layer of cyberspace in a number of ways to cause it to malfunction or shut down completely in order to inhibit the flow of data.
The logic layer of cyberspace can be attacked and altered in a variety of ways. Many of its central elements are under attack every day as malicious actors attempt to break the system. And unlike a physical attack on infrastructure, which would require time, coordination, and access in order to damage enough elements to successfully cut a state off from cyberspace, a well-designed cyber attack at the logic layer can hit multiple key nodes at once. Thus, while it is technologically much more difficult to attack the key nodes of the logic layer, there is a synergy present that could make a single, advanced cyber attack more successful at creating an A2/AD environment than many coordinated attacks on the physical infrastructure.
There are some safeguards in place to diminish the risk of cyber attacks that target these systems, including redundancies and the ability to reroute traffic through an uncompromised server. For example, when all thirteen of the Internet root servers were attacked simultaneously in 2002, enough of them withstood the attack to keep the Internet functioning, even though several were temporarily shut down. More than a decade after that attack, vulnerabilities still exist and can be exploited. Chief among them are the operating systems that manage the wavelengths of the fiber optic cables as they come ashore at landing sites. Using these systems, hackers can manipulate the wavelengths to alter or remove some or all of the data traffic on that cable, potentially without the operator's knowledge.
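As a minimal illustration of the redundancy principle described above (using the third-party dnspython package; the resolver addresses are well-known public resolvers chosen purely for illustration), a client can try each name server in turn and fall back to the next when one is unreachable, so lookups continue even when some servers are knocked offline:

```python
import dns.exception
import dns.resolver  # pip install dnspython

def resolve_with_fallback(name, resolver_ips):
    """Query each resolver in order; fall back to the next if one fails."""
    for ip in resolver_ips:
        try:
            r = dns.resolver.Resolver(configure=False)
            r.nameservers = [ip]
            r.lifetime = 2.0                 # per-server timeout in seconds
            return [a.to_text() for a in r.resolve(name, "A")]
        except dns.exception.DNSException:
            continue                         # server unreachable: try the next one
    raise RuntimeError(f"all resolvers failed for {name}")

print(resolve_with_fallback("example.com", ["8.8.8.8", "1.1.1.1"]))
```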