HIV-associated neurocognitive disorders (HANDs) are prevalent in older people living with HIV (PLWH) worldwide. HAND prevalence and incidence studies of the newly emergent population of combination antiretroviral therapy (cART)-treated older PLWH in sub-Saharan Africa are currently lacking. We aimed to estimate HAND prevalence and incidence using robust measures in stable, cART-treated older adults under long-term follow-up in Tanzania and report cognitive comorbidities.
A systematic sample of consenting HIV-positive adults aged ≥50 years attending routine clinical care at an HIV Care and Treatment Centre was recruited during March–May 2016 and followed up during March–May 2017.
HAND were diagnosed by consensus panel using the Frascati criteria, based on a detailed, locally normed, low-literacy neuropsychological battery, a structured neuropsychiatric clinical assessment, and collateral history. Demographic and etiological factors were assessed by self-report and from clinical records.
In this cohort (n = 253, 72.3% female, median age 57 years), HAND prevalence was 47.0% (95% CI 40.9–53.2, n = 119) despite well-managed HIV disease (median CD4 count 516, range 98–1719; 95.5% on cART). Of these 119 cases, 64 (25.3% of the cohort) had asymptomatic neurocognitive impairment, 46 (18.2%) mild neurocognitive disorder, and 9 (3.6%) HIV-associated dementia. One-year incidence was high (37.2%, 95% CI 25.9–51.8), but some reversibility (17.6%, 95% CI 10.0–28.6, n = 16) was also observed.
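The abstract does not state which interval method was used, but the reported prevalence interval is consistent with a simple normal-approximation (Wald) confidence interval for a proportion; a minimal sketch (the agreement is suggestive only):

```python
import math

def wald_ci(k, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a proportion k/n."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# 119 of 253 participants met the Frascati criteria for HAND
lo, hi = wald_ci(119, 253)
print(f"{119/253:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # 47.0% (95% CI 40.9%-53.2%)
```

For small samples or proportions near 0 or 1, a Wilson or exact interval would be preferable; the Wald form is shown here only because it reproduces the published figures.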
HANDs appear highly prevalent in older PLWH in this setting, where the demographic profile differs markedly from that of high-income cohorts and comorbidities are frequent. Incidence and reversibility also appear high. Future studies should focus on etiologies and potentially reversible factors in this setting.
There is global interest in the reconfiguration of community mental health services, including primary care, to improve clinical and cost effectiveness.
This study seeks to describe patterns of service use, continuity of care, health risks, physical healthcare monitoring and the balance between primary and secondary mental healthcare for people with severe mental illness in receipt of secondary mental healthcare in the UK.
We conducted an epidemiological medical records review in three UK sites. We identified 297 cases randomly selected from the three participating mental health services. Data were manually extracted from electronic patient medical records from both secondary and primary care, for a 2-year period (2012–2014). Continuous data were summarised by mean and s.d. or median and interquartile range (IQR). Categorical data were summarised as percentages.
The majority of care was from secondary care practitioners: of the 18 210 direct contacts recorded, 76% were from secondary care (median, 36.5; IQR, 14–68) and 24% were from primary care (median, 10; IQR, 5–20). There was evidence of poor longitudinal continuity: in primary care, 31% of people had poor longitudinal continuity (Modified Modified Continuity Index ≤0.5), and 43% had a single named care coordinator in secondary care services over the 2 years.
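The ≤0.5 threshold above refers to the Modified Modified Continuity Index (MMCI). The abstract names the index but not its formula; one commonly used formulation, shown here as an assumption, scores 1.0 when all visits are with a single provider and falls toward 0 as care is dispersed:

```python
def mmci(n_visits, n_providers):
    """Modified Modified Continuity Index (assumed standard formulation):
    1.0 when every visit is with the same provider, lower as the number
    of distinct providers grows relative to the number of visits."""
    if n_visits == 0:
        return None  # continuity is undefined with no visits
    return (1 - n_providers / (n_visits + 0.1)) / (1 - 1 / (n_visits + 0.1))

# A patient seen 10 times by a single GP has perfect continuity...
print(mmci(10, 1))            # 1.0
# ...while the same visit count spread over 6 clinicians falls below
# the 0.5 cut-off used in the study to flag poor continuity.
print(round(mmci(10, 6), 2))  # 0.45
```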
The study indicates scope for improvement in supporting mental health service delivery in primary care. Greater knowledge of how care is organised presents an opportunity to ensure some rebalancing of the care that all people with severe mental illness receive, when they need it. A future publication will examine differences between the three sites that participated in this study.
Emotional cognition and effective interpretation of affective information is an important factor in social interactions and everyday functioning, and difficulties in these areas may contribute to aetiology and maintenance of mental health conditions. In younger people with depression and anxiety, research suggests significant alterations in behavioural and brain activation aspects of emotion processing, with a tendency to appraise neutral stimuli as negative and attend preferentially to negative stimuli. However, in ageing, research suggests that emotion processing becomes subject to a ‘positivity effect’, whereby older people attend more to positive than negative stimuli.
This review examines data from studies of emotion processing in Late-Life Depression and Late-Life Anxiety to attempt to understand the significance of emotion processing variations in these conditions, and their interaction with changes in emotion processing that occur with ageing.
We conducted a systematic review following PRISMA guidelines. Articles that used an emotion-based processing task, examined older persons with depression or an anxiety disorder and included a healthy control group were included.
In Late-Life Depression, there is little consistent behavioural evidence of impaired emotion processing, but there is evidence of altered brain circuitry during these processes. In Late-Life Anxiety and Post-Traumatic Stress Disorder, there is evidence of interference with the processing of negative or threat-related words.
How these findings fit with the positivity bias of ageing is not clear. Future research is required in larger groups, further examining the interaction between illness and age and the significance of age at disease onset.
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. Over much of the band, the system temperature is approximately 22 K, and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal-processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
Decisions on the use of nature reflect the values and rights of individuals, communities and society at large. The values of nature are expressed through cultural norms, rules and legislation, and they can be elicited using a wide range of tools, including those of economics. None of the approaches to elicit peoples’ values are neutral. Unequal power relations influence valuation and decision-making and are at the core of most environmental conflicts. As actors in sustainability thinking, environmental scientists and practitioners are becoming more aware of their own posture, normative stance, responsibility and relative power in society. Based on a transdisciplinary workshop, our perspective paper provides a normative basis for this new community of scientists and practitioners engaged in the plural valuation of nature.
To determine whether the Society for Healthcare Epidemiology of America (SHEA) and the Infectious Diseases Society of America (IDSA) Clostridioides difficile infection (CDI) severity criteria adequately predict poor outcomes.
Retrospective validation study.
Setting and participants:
Patients with CDI in the Veterans’ Affairs Health System from January 1, 2006, to December 31, 2016.
For the 2010 criteria, patients with leukocytosis or a serum creatinine (SCr) value ≥1.5 times the baseline were classified as severe. For the 2018 criteria, patients with leukocytosis or an SCr value ≥1.5 mg/dL were classified as severe. Poor outcomes were defined as hospital or intensive care admission within 7 days of diagnosis, colectomy within 14 days, or 30-day all-cause mortality; they were modeled as a function of the 2010 and 2018 criteria separately using logistic regression.
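The two classification rules above are simple enough to state as code. A minimal sketch, following the abstract's definitions; the leukocytosis cut-off of 15 × 10³ cells/μL is an assumption (the abstract says only "leukocytosis"), and episodes with missing labs are returned as unclassifiable:

```python
def cdi_severity(wbc, scr, scr_baseline=None, criteria="2018"):
    """Classify a CDI episode per the SHEA/IDSA severity criteria as
    described in the abstract: 'severe', 'non-severe', or None when the
    required labs are missing (the 'unclassifiable' episodes reported).
    wbc in 10^3 cells/uL, scr in mg/dL. The WBC >= 15 cut-off for
    leukocytosis is an assumption, not stated in the abstract."""
    if criteria == "2010":
        # 2010: creatinine relative to the patient's own baseline
        have_scr = scr is not None and scr_baseline is not None
        scr_flag = have_scr and scr >= 1.5 * scr_baseline
    else:
        # 2018: absolute creatinine threshold, no baseline needed
        have_scr = scr is not None
        scr_flag = have_scr and scr >= 1.5
    wbc_flag = wbc is not None and wbc >= 15
    if wbc_flag or scr_flag:
        return "severe"
    if wbc is None or not have_scr:
        return None  # unclassifiable
    return "non-severe"

print(cdi_severity(wbc=18, scr=1.0))                                  # severe
print(cdi_severity(wbc=9, scr=1.6, scr_baseline=0.8, criteria="2010"))  # severe
print(cdi_severity(wbc=None, scr=None))                               # None
```

Note how the 2010 rule becomes unclassifiable whenever a baseline creatinine is unavailable, which is one plausible reason the abstract reports more unclassifiable episodes under the 2010 criteria than the 2018 criteria.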
We analyzed data from 86,112 episodes of CDI. Severity was unclassifiable in a large proportion of episodes diagnosed in subacute care (2010, 58.8%; 2018, 49.2%). Sensitivity ranged from 0.48 for subacute care using 2010 criteria to 0.73 for acute care using 2018 criteria. Areas under the curve were poor and similar (0.60 for subacute care and 0.57 for acute care) for both versions, but negative predictive values were >0.80.
Model performances across care settings and criteria versions were generally poor but had reasonably high negative predictive value. Many patients in the subacute-care setting, an increasing fraction of CDI cases, could not be classified. More work is needed to develop criteria to identify patients at risk of poor outcomes.
Protected areas are central to global efforts to prevent species extinctions, with many countries investing heavily in their establishment. Yet the designation of protected areas alone can only abate certain threats to biodiversity. Targeted management within protected areas is often required to achieve fully effective conservation within their boundaries. It remains unclear what combination of protected area designation and management is needed to remove the suite of processes that imperil species. Here, using Australia as a case study, we use a dataset on the pressures facing threatened species to determine the role of protected areas and management in conserving imperilled species. We found that protected areas that are not resourced for threat management could remove one or more threats to 1,185 (76%) species and all threats to very few (n = 51, 3%) species. In contrast, a protected area network that is adequately resourced to manage threatening processes within their boundary could remove one or more threats to almost all species (n = 1,551; c. 100%) and all threats to almost half (n = 740, 48%). However, 815 (52%) species face one or more threats that require coordinated conservation actions that protected areas alone could not remove. This research shows that investing in the continued expansion of Australia's protected area network without providing adequate funding for threat management within and beyond the existing protected area network will benefit few threatened species. These findings highlight that as the international community expands the global protected area network in accordance with the 2020 Strategic Plan for Biodiversity, a greater emphasis on the effectiveness of threat management is needed.
Connectedness is a central dimension of personal recovery from severe mental illness (SMI). Research reports that people with SMI have lower social capital and poorer-quality social networks compared to the general population.
To identify personal well-being network (PWN) types and explore additional insights from mapping connections to places and activities alongside social ties.
We carried out 150 interviews with individuals with SMI and mapped social ties, places and activities, and their impact on well-being. PWN types were developed using social network analysis and hierarchical k-means clustering of these data.
Three PWN types were identified: formal and sparse; family and stable; and diverse and active. Well-being and social capital varied within and among types. Place and activity data indicated important contextual differences within social connections that were not found by mapping social networks alone.
Place locations and meaningful activities are important aspects of people's social worlds. Mapped alongside social networks, PWNs have important implications for person-centred recovery approaches by providing a broader understanding of individuals' lives and resources.
Objectives: Agenesis of the corpus callosum (AgCC), characterized by developmental absence of the corpus callosum, is one of the most common congenital brain malformations. To date, there are limited data on the neuropsychological consequences of AgCC and the factors that modulate different outcomes, especially in children. This study aimed to describe general intellectual, academic, executive, social and behavioral functioning in a cohort of school-aged children presenting for clinical services to a hospital and diagnosed with AgCC. The influences of age, social risk and neurological factors were examined. Methods: Twenty-eight school-aged children (8 to 17 years) diagnosed with AgCC completed tests of general intelligence (IQ) and academic functioning. Executive, social and behavioral functioning in daily life, and social risk, were estimated from parent- and teacher-rated questionnaires. MRI findings reviewed by a pediatric neurologist confirmed diagnosis and identified brain characteristics. Clinical details including the presence of epilepsy and diagnosed genetic conditions were obtained from medical records. Results: In our cohort, ~50% of children experienced general intellectual, academic, executive, social and/or behavioral difficulties, and ~20% were functioning at a level comparable to typically developing children. Social risk was important for understanding variability in neuropsychological outcomes. Brain anomalies and complete AgCC were associated with lower mathematics performance and poorer executive functioning. Conclusions: This is the first comprehensive report of the general intellectual, academic, executive, social and behavioral consequences of AgCC in school-aged children. The findings have important clinical implications, suggesting that support to families and targeted intervention could promote positive neuropsychological functioning in children with AgCC who come to clinical attention. (JINS, 2018, 24, 445–455)
Artemether (ATM) cardiotoxicity, its short half-life and its low oral bioavailability are the major factors limiting its use to treat malaria. The purposes of this work were to study the cardiotoxicity of free ATM and of ATM-loaded poly-ε-caprolactone nanocapsules (ATM-NC), and their oral efficacy in Plasmodium berghei-infected mice. ATM-NC were obtained by interfacial polymer deposition, with ATM associated with the oily core of the polymeric NC. For cardiotoxicity evaluation, male black C57BL6 uninfected or P. berghei-infected mice received, by the oral route twice daily for 4 days, vehicle (sorbitol/carboxymethylcellulose), blank NC, free ATM or ATM-NC at doses of 40, 80 or 120 mg kg−1. Electrocardiogram (ECG) lead II signals were obtained before and after treatment. For efficacy evaluation, female P. berghei-infected mice were treated in the same way. ATM-NC improved in vivo antimalarial efficacy and reduced mouse mortality. Free ATM induced significant prolongation of the QT and QTc intervals. ATM-NC (120 mg kg−1) given to uninfected mice reduced QT and QTc interval prolongation by 34% and 30%, respectively, compared with free ATM. ATM-NC given to infected mice also reduced QT and QTc interval prolongation, by 28% and 27%, respectively. For the first time, the study showed a nanocarrier reducing the cardiotoxicity of orally administered ATM, and ATM-NC were more effective against P. berghei than free ATM as monotherapy.
Analysis of in situ neutron powder diffraction data collected for the porous framework material Zn(hba) during gas adsorption reveals a two-stage response of the host lattice to increasing CO2 guest concentration, suggesting progressive occupation of multiple CO2 adsorption sites with different binding strengths. The response of the lattice to moderate CH4 guest concentrations is virtually indistinguishable from the response to CO2, demonstrating that the influence of host–guest interactions on the Zn(hba) framework is defined more strongly by the concentration than by the identity of the guests.
The adverse environmental impacts of projects supported by Multilateral Development Banks (MDBs), such as the World Bank or its regional counterparts, have been denounced for decades. The domains in which MDBs operate logically bear environmental and social risks which can be significant: development of transportation, agribusiness, energy sources, extractive industries, etc. This includes projects to develop highways, airports, dams and reservoirs, irrigation systems, wind farms, coal power plants and mining, to reorganize land management, or to reform the legal frameworks governing land tenure or forest concessions. Such projects may entail changing land-use patterns and natural habitats, or cause disruptions affecting the water cycle, biodiversity, soils or forests, not to mention the human impacts: ‘involuntary’ (in the language of MDBs) and sometimes unwanted resettlement, destruction of cultural or spiritual heritage, loss of livelihoods, forced evictions, etc. The poor environmental record of the World Bank Group is richly documented, including by the World Bank itself, thanks to the reports of the Operations Evaluation Department (OED), later transformed into the Independent Evaluation Group (IEG). For example, the 2001 OED Review of the Bank's Performance on the Environment notes that:
“To be sure, these achievements fell short of the expectations of many of its stakeholders. The momentum of the early 1990s dissipated in the face of constraints faced in the operating environment. Internally, environmental sustainability was not adequately integrated into the Bank's core objectives and country assistance strategies. Intellectually, the linkages between macroeconomic policy, poverty alleviation, and environmental sustainability were not explicitly forged. In sum, the institution's environmental efforts have not been consistent nor have they been held to uniform quality standards. Yet, staff have carried out many worthwhile activities related to the environment (…) This OED report finds that the Bank has made progress on the environment, and notes that its commitments were not accompanied by precise goals and performance monitoring.”
In 2008, the IEG finds that:
“When requested, the Bank Group has been generally able to help countries set environmental priorities (although this is ultimately the responsibility of the countries themselves) and private sector clients to identify and address potential direct environmental impacts.
The purpose of this study was to quantify the effect of multidrug-resistant (MDR) gram-negative bacteria and methicillin-resistant Staphylococcus aureus (MRSA) healthcare-associated infections (HAIs) on mortality following infection, regardless of patient location.
We conducted a retrospective cohort study of patients with an inpatient admission in the US Department of Veterans Affairs (VA) system between October 1, 2007, and November 30, 2010. We constructed multivariate log-binomial regressions to assess the impact of a positive culture on mortality in the 30- and 90-day periods following the first positive culture, using a propensity-score–matched subsample.
Patients identified with positive cultures due to MDR Acinetobacter (n=218), MDR Pseudomonas aeruginosa (n=1,026), and MDR Enterobacteriaceae (n=3,498) were propensity-score matched to 14,591 patients without positive cultures due to these organisms. In addition, 3,471 patients with positive cultures due to MRSA were propensity-score matched to 12,499 patients without positive MRSA cultures. Multidrug-resistant gram-negative bacteria were associated with a significantly elevated risk of mortality both for invasive (RR, 2.32; 95% CI, 1.85–2.92) and noninvasive cultures (RR, 1.33; 95% CI, 1.22–1.44) during the 30-day period. Similarly, patients with MRSA HAIs (RR, 2.77; 95% CI, 2.39–3.21) and colonizations (RR, 1.32; 95% CI, 1.22–1.50) had an increased risk of death at 30 days.
We found that HAIs due to gram-negative bacteria and MRSA conferred significantly elevated 30- and 90-day risks of mortality. This finding held true both for invasive cultures, which are likely to be true infections, and noninvasive infections, which are possibly colonizations.
Estimates of the excess length of stay (LOS) attributable to healthcare-associated infections (HAIs) in which the total LOS of patients with and without HAIs is compared are biased because they fail to account for the timing of infection. Alternative methods that appropriately treat HAI as a time-varying exposure are multistate models and cohort studies that match on the timing of infection. We examined the magnitude of this time-dependent bias in published studies that compared the different methodological approaches.
We conducted a systematic review of the published literature to identify studies that report attributable LOS estimates using both total LOS (time-fixed) methods and either multistate models or matching patients with and without HAIs using the timing of infection.
Of the 7 studies that compared time-fixed methods to multistate models, conventional methods resulted in estimates of the LOS attributable to HAIs that were, on average, 9.4 days longer or 238% greater than those generated using multistate models. Of the 5 studies that compared time-fixed methods to matching on timing of infection, conventional methods resulted in estimates of the LOS attributable to HAIs that were, on average, 12.6 days longer or 139% greater than those generated by matching on timing of infection.
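The direction of the bias documented above can be seen in a deterministic toy example (the numbers are entirely hypothetical and are not from the reviewed studies; the only assumption is that longer stays carry more exposure time, so long-stay patients are the ones who acquire HAIs):

```python
# Toy illustration of time-dependent bias: suppose an HAI adds exactly
# 2 days of stay, but only patients still hospitalised on day 5 are at
# risk of acquiring one.
baseline_los = [2, 3, 4, 8, 10, 12]                   # stays without any HAI
infected = [los for los in baseline_los if los > 5]   # long stays acquire HAI
hai_los = [los + 2 for los in infected]               # true effect: +2 days
no_hai_los = [los for los in baseline_los if los <= 5]

mean = lambda xs: sum(xs) / len(xs)
# Time-fixed comparison of total LOS: mean(10, 12, 14) - mean(2, 3, 4)
naive_excess = mean(hai_los) - mean(no_hai_los)
print(naive_excess)  # 9.0 days, far above the true 2-day effect
```

Because infection status is correlated with the pre-infection length of stay, the time-fixed contrast attributes the whole long-stay tendency to the HAI; multistate models and timing-matched cohorts avoid this by only counting time after the infection occurs.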
Our results suggest that estimates of the attributable LOS due to HAIs depend heavily on the methods used to generate those estimates. Overestimation of this effect can lead to incorrect assumptions of the likely cost savings from HAI prevention measures.
Infect. Control Hosp. Epidemiol. 2015;36(9):1089–1094
Standard estimates of the impact of Clostridium difficile infections (CDI) on inpatient lengths of stay (LOS) may overstate inpatient care costs attributable to CDI. In this study, we used multistate modeling (MSM) of CDI timing to reduce bias in estimates of excess LOS.
A retrospective cohort study of all hospitalizations at any of 120 acute care facilities within the US Department of Veterans Affairs (VA) between 2005 and 2012 was conducted. We estimated the excess LOS attributable to CDI using an MSM to address time-dependent bias. Bootstrapping was used to generate 95% confidence intervals (CI). These estimates were compared to unadjusted differences in mean LOS for hospitalizations with and without CDI.
During the study period, there were 3.96 million hospitalizations and 43,540 CDIs. A comparison of unadjusted means suggested an excess LOS of 14.0 days (19.4 vs 5.4 days). In contrast, the MSM estimated an attributable LOS of only 2.27 days (95% CI, 2.14–2.40). The excess LOS for mild-to-moderate CDI was 0.75 days (95% CI, 0.59–0.89), and for severe CDI, it was 4.11 days (95% CI, 3.90–4.32). Substantial variation across the Veteran Integrated Services Networks (VISN) was observed.
CDI significantly contributes to LOS, but the magnitude of its estimated impact is smaller when methods are used that account for the time-varying nature of infection. The greatest impact on LOS occurred among patients with severe CDI. Significant geographic variability was observed. MSM is a useful tool for obtaining more accurate estimates of the inpatient care costs of CDI.
Infect. Control Hosp. Epidemiol. 2015;36(9):1024–1030
Whole-grain intake has been reported to be associated with a lower risk of several lifestyle-related diseases such as type 2 diabetes, CVD and some types of cancers. As measurement errors in self-reported whole-grain intake assessments can be substantial, dietary biomarkers can serve as complementary tools for assessing dietary intake. Alkylresorcinols (AR) are phenolic lipids found, among commonly consumed foods, almost exclusively in whole-grain wheat and rye products, and are considered valid biomarkers of the intake of these products. In the present study, we analysed the plasma concentrations of five AR homologues in 2845 participants from ten European countries from a nested case–control study in the European Prospective Investigation into Cancer and Nutrition. High concentrations of plasma total AR were found in participants from Scandinavia and Central Europe and lower concentrations in those from the Mediterranean countries. The geometric mean plasma total AR concentrations were between 35 and 41 nmol/l in samples drawn from fasting participants in the Central European and Scandinavian countries and below 23 nmol/l in those of participants from the Mediterranean countries. The whole-grain source (wheat or rye) could be determined using the ratio of two of the homologues. The main source was wheat in Greece, Italy, the Netherlands and the UK, whereas rye was also consumed in considerable amounts in Germany, Denmark and Sweden. The present study demonstrates a considerable variation in the plasma concentrations of total AR and concentrations of AR homologues across ten European countries, reflecting both quantitative and qualitative differences in the intake of whole-grain wheat and rye.