Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that Frascati criteria are too liberal, resulting in a high false positive rate. Meyer et al. recommended more conservative revisions to HAND criteria, including exploring other methods commonly used to define neurocognitive impairment (NCI) in HIV, such as the global deficit score (GDS). This study compares NCI classifications by Frascati, Meyer, and GDS methods in relation to neuroimaging markers of brain integrity in HIV.
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing, brain structural magnetic resonance imaging, and magnetic resonance spectroscopy. Participants were classified using Frascati versus Meyer criteria as concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS with the Meyer criteria, the same groupings were constructed using GDS criteria in place of Frascati criteria.
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
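The global deficit score mentioned above is conventionally computed by mapping each demographically corrected T-score in the battery to a 0–5 deficit score and averaging, with GDS ≥ 0.5 taken as global impairment. As a minimal sketch of that convention (the cut-points below follow the commonly cited GDS mapping; the study's exact battery and norms are not reproduced here):

```python
def deficit_score(t_score):
    """Map a demographically corrected T-score to a 0-5 deficit score.

    Follows the commonly cited GDS convention: T >= 40 is within
    normal limits (deficit 0); lower T-scores earn higher deficits.
    """
    if t_score >= 40:
        return 0
    if t_score >= 35:
        return 1
    if t_score >= 30:
        return 2
    if t_score >= 25:
        return 3
    if t_score >= 20:
        return 4
    return 5

def global_deficit_score(t_scores):
    """Average the per-test deficit scores across the battery."""
    return sum(deficit_score(t) for t in t_scores) / len(t_scores)

def gds_impaired(t_scores, cutoff=0.5):
    """GDS >= 0.5 is the conventional threshold for global impairment."""
    return global_deficit_score(t_scores) >= cutoff
```

Because normal-range scores contribute zero, the GDS emphasises the number and severity of deficits rather than averaging away impairment with strong performances, which is one reason it classifies differently from domain-count criteria.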
The focus of community ecology has shifted from the description of taxonomic composition towards an understanding of community assembly based on species’ ‘functional traits’. The functional trait approach is well developed for vascular plants, utilising variability of continuous phenotypic characters that affect ecological fitness, such as specific leaf area, tissue nitrogen concentration or seed mass, to explain community structure. In contrast, community assembly studies for poikilohydric cryptogamic plants and fungi, such as lichens, remain focused on broad categorical traits such as growth form difference: fruticose, foliose or crustose. This study examined intra- and interspecific variability for two highly promising continuous phenotypic measurements that affect lichen physiology and ecological fitness: water-holding capacity (WHC) and specific thallus mass (STM). Values for WHC and STM were compared within and among species, and within and among key macrolichen growth forms (fruticose and green-algal and cyanolichen foliose species), asking whether these widely used categories adequately differentiate the continuous variables (WHC and STM). We show large intra- and interspecific variability that does not map satisfactorily onto growth form categories, and on this basis provide recommendations and caveats in the future use of lichen functional traits.
Systematic data collection for direct statistical analysis of biodiversity trends tends to be focused on charismatic fauna and flora such as birds or vascular plants. When subsequently applied by conservation agencies in summary metrics tracking habitat and species protection, these patterns in biodiversity loss or gain can fail to capture outcomes for groups that have a prominent importance in habitat composition, diversity and ecological function, such as algae, bryophytes, lichens and other fungi. Such species are primarily recorded on an ad hoc basis by taxonomic specialists, yielding noisy data that present problems in robustly identifying trends. This study explored the use of ad hoc field-recorded data as a potential source of biodiversity information, by comparing the pattern of recording for carefully selected indicator species with those for benchmark or control species as a proxy for recording effort. Focusing on Scotland’s internationally important epiphytic lichens, and especially ‘old-growth’ indicator species, British Lichen Society data revealed a decline in the extent of these species in Scotland, relative to recording effort, over a period of five decades. A recent slowing in the rate of decline is observed but remains to be confirmed. The long-term decline is consistent with the effect of land use intensification, resulting in small and isolated populations that are vulnerable to extinction debt. We caution that remedial protection and monitoring for such populations remains vital as a complement to Scotland’s larger scale ambition for increased woodland extent and connectivity.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and to examine clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences in neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
Flexible piezoelectric generators (PEGs) present a unique opportunity for renewable and sustainable energy harvesting. Here, we present a low-temperature and low-energy deposition method using solvent evaporation-assisted three-dimensional printing to deposit electroactive poly(vinylidene fluoride) (PVDF)-trifluoroethylene (TrFE) up to 19 structured layers. Visible-wavelength transmittance was above 92%, while ATR-FTIR spectroscopy showed little change in the electroactive phase fraction between layer depositions. Electroactivity from the fabricated PVDF-TrFE PEGs showed that a single structured layer gave the greatest output at 289.3 mV peak-to-peak voltage. This was proposed to be due to shear-induced polarization affording the alignment of the fluoropolymer dipoles without an electric field or high temperature.
Introduction: Emergency departments (ED) across Canada acknowledge the need to transform in order to provide high quality care for the increasing proportion of older patients presenting for treatment. Older people are more complex than younger ED users. They have a disproportionately high use of EDs, increased rates of hospitalization, and are more likely to suffer adverse events. The objective of this initiative was to develop minimum standards for the care of older people in the emergency department. Methods: We created a panel of international leaders in geriatrics and emergency medicine to develop a policy framework on minimum standards for care of older people in the ED. We conducted a literature review of international guidelines, frameworks, recommendations, and best practices for the acute care of older people and developed a draft standards document. This preliminary document was circulated to interdisciplinary members of the International Federation of Emergency Medicine (IFEM) geriatric emergency medicine (GEM) group. Following review, the standards were presented to the IFEM clinical practice group. At each step, verbal, written and online feedback were gathered and integrated into the final minimum standards document. Results: Following the developmental process, a series of eight minimum standard statements were created and accepted by IFEM. These standards utilise the IFEM Framework for Quality and Safety in the ED, and are centred on the recognition that older people are a core population of emergency health service users whose care needs are different from those of children and younger adults. They cover key areas, including the overall approach to older patients, the physical environment and equipment, personnel and training, policies and protocols, and strategies for navigating the health-care continuum. 
Conclusion: These standards aim to improve the evaluation, management and integration of care of older people in the ED in an effort to improve outcomes. The minimum standards represent a first step on which future activities can be built, including the development of specific indicators for each of the minimum standards. The standards are designed to apply across the spectrum of EDs worldwide, and it is hoped that they will act as a catalyst to change.
Recent evidence suggests that exercise plays a role in cognition and that the posterior cingulate cortex (PCC) can be divided into dorsal and ventral subregions based on distinct connectivity patterns.
To examine the effect of physical activity and of the dorsal/ventral subdivision of the PCC on brain functional connectivity measures in subjective memory complainers (SMC) carrying the epsilon 4 allele of apolipoprotein E (APOE ɛ4).
Participants were 22 SMC carrying the APOE ɛ4 allele (ɛ4+; mean age 72.18 years) and 58 SMC non-carriers (ɛ4–; mean age 72.79 years). Connectivity of four dorsal and ventral seeds was examined. Relationships between PCC connectivity and physical activity measures were explored.
ɛ4+ individuals showed increased connectivity between the dorsal PCC and dorsolateral prefrontal cortex, and the ventral PCC and supplementary motor area (SMA). Greater levels of physical activity correlated with the magnitude of ventral PCC–SMA connectivity.
The results provide the first evidence that ɛ4+ individuals at increased risk of cognitive decline show distinct alterations in dorsal and ventral PCC functional connectivity.
The Virology Department, VSD, DARDNI, in collaboration with colleagues in Europe and North America, has had an active research programme on porcine circovirus-related diseases, including PMWS, for the last 5 years. This presentation will highlight some of our on-going research in this area.
Objectives: Preterm children demonstrate deficits in executive functions including inhibition, working memory, and cognitive flexibility; however, their goal setting abilities (planning, organization, strategic reasoning) remain unclear. This study compared goal setting abilities between very preterm (VP: <30 weeks/<1250 grams) and term born controls during late childhood. Additionally, early risk factors (neonatal brain abnormalities, medical complications, and sex) were examined in relationship to goal setting outcomes within the VP group. Methods: Participants included 177 VP and 61 full-term born control children aged 13 years. Goal setting was assessed using several measures of planning, organization, and strategic reasoning. Parents also completed the Behavior Rating Inventory of Executive Function. Regression models were performed to compare groups, with secondary analyses adjusting for potential confounders (sex and social risk), and excluding children with major neurosensory impairment and/or IQ<70. Within the VP group, regression models were performed to examine the relationship between brain abnormalities, medical complications, and sex, on goal setting scores. Results: The VP group demonstrated a clear pattern of impairment and inefficiency across goal setting measures, consistent with parental report, compared with their full-term born peers. Within the VP group, moderate/severe brain abnormalities on neonatal MRI predicted adverse goal setting outcomes at 13. Conclusions: Goal setting difficulties are a significant area of concern in VP children during late childhood. These difficulties are associated with neonatal brain abnormalities, and are likely to have functional consequences academically, socially and vocationally. (JINS, 2018, 24, 372–381)
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group, age: M=42.65 years, SD=8.93; 86% male; education: M=13.17 years, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (54% vs. 42%), and of domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet, Latinos continued to show more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed in participants of Puerto Rican (n=60; 71%) versus Mexican (n=79; 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially those of Puerto Rican (vs. Mexican) origin/descent, had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry.
Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
In-spiraling supermassive black holes should emit gravitational waves, which would produce characteristic distortions in the time of arrival residuals from millisecond pulsars. Multiple national and regional consortia have constructed pulsar timing arrays by precise timing of different sets of millisecond pulsars. An essential aspect of precision timing is the transfer of the times of arrival to a (quasi-)inertial frame, conventionally the solar system barycenter. The barycenter is determined from the knowledge of the planetary masses and orbits, which has been refined over the past 50 years by multiple spacecraft. Within the North American Nanohertz Observatory for Gravitational Waves (NANOGrav), uncertainties on the solar system barycenter are emerging as an important element of the NANOGrav noise budget. We describe what is known about the solar system barycenter, touch upon how uncertainties in it affect gravitational wave studies with pulsar timing arrays, and consider future trends in spacecraft navigation.
A global array of 20 radio observatories was used to measure the three-dimensional position and velocity of the two meteorological balloons that were injected into the equatorial region of the Venus atmosphere by the VEGA spacecraft.
To investigate the feasibility of a national audit of epistaxis management led and delivered by a multi-region trainee collaborative using a web-based interface to capture patient data.
Six trainee collaboratives across England nominated one site each and worked together to carry out this pilot. An encrypted data capture tool was adapted and installed within the infrastructure of a university secure server. Site-lead feedback was assessed through questionnaires.
Sixty-three patients with epistaxis were admitted over a two-week period. Site leads reported an average of 5 minutes to complete questionnaires and described the tool as easy to use. Data quality was high, with little missing data. Site-lead feedback showed high satisfaction ratings for the project (mean, 4.83 out of 5).
This pilot showed that trainee collaboratives can work together to deliver an audit using an encrypted data capture tool cost-effectively, whilst maintaining the highest levels of data quality.
Field survey by a taxonomist or specialist biologist (‘taxonomic survey’) provides a comprehensive inventory of species in a habitat. Common and conspicuous species are rapidly recorded and search effort can be targeted to inconspicuous or rare species. However, the subjective nature of taxonomic survey limits its usefulness in ecological monitoring and analysis. In contrast, ‘ecological sampling’, focused on the standardized use of repeated sub-units such as quadrats, is designed to quantify the observational error of results, allowing for more robust statistical treatment. Nevertheless, the spatial extent of recording will be lower during ecological sampling, and rarities might be missed. Despite their differences, these two approaches are often assumed to be congruent for decision making. Taxonomic survey is commonly used to identify priority sites for conservation (including species-rich sites, or those with many rare/threatened species) while ecological sampling is used to design conservation strategy by relating species richness or composition to habitat dynamics. If these contrasting approaches are indeed congruent, then trends in species richness and community composition, detected by ecological sampling, will mirror the results of taxonomic survey so that management confidently protects the attributes for which a site was prioritized. This study performed both taxonomic survey and ecological sampling for lichen epiphytes in 13 woodland study sites in Scotland. To understand the procedure of taxonomic survey, fieldwork by a professional taxonomist was structured by effort into 15-minute time intervals. As expected, taxonomic survey discovered more species per site, while ecological sampling (allowing a measure of species frequency) resolved greater variation in community composition. 
However, the patterns of richness and species composition obtained from the different methods were correlated, suggesting an overall high degree of congruence in identifying and then managing priority sites. Furthermore, when exploring the taxonomic survey in detail, we found that a minimum effort of 45 minutes was required to accurately determine species richness differences among contrasting woodland sites.
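The 15-minute-interval structure of the taxonomic survey amounts to a species accumulation curve: a running total of distinct species against survey effort, from which a minimum adequate effort (45 minutes here) can be read off. A minimal sketch of that bookkeeping, using hypothetical interval records rather than the study's data:

```python
def cumulative_richness(interval_records):
    """Cumulative species richness after each survey interval.

    interval_records: list of sets, one per 15-minute interval, each
    holding the species recorded in that interval (hypothetical format).
    """
    seen = set()
    curve = []
    for records in interval_records:
        seen |= records
        curve.append(len(seen))
    return curve

def minutes_to_fraction(interval_records, fraction=0.9, interval_min=15):
    """Survey minutes needed before the running total first reaches the
    given fraction of the final richness observed at the site."""
    curve = cumulative_richness(interval_records)
    target = fraction * curve[-1]
    for i, richness in enumerate(curve):
        if richness >= target:
            return (i + 1) * interval_min
    return None
```

Comparing such curves across sites shows whether a fixed effort budget preserves the rank order of site richness, which is the practical question behind the 45-minute recommendation.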
The Southern Ocean is the largest of the high-nutrient, low-chlorophyll (HNLC) regions of the world ocean. Phytoplankton production fails to utilise completely the pool of inorganic nutrients in the euphotic zone, giving rise to low phytoplankton biomass and leaving relatively high summer nutrient concentrations. This enigma is of considerable significance for our understanding of the role of the oceans in the global carbon cycle. Various limiting factors have been considered: low light, low temperature, absence of necessary trace elements, grazing pressure and other means of biomass removal.
The dynamics of nitrogen uptake by phytoplankton are of particular importance. Classically, nitrate mixed into the surface layer during winter provides the nitrogen pool for growth in the spring bloom. Some organic material is exported to depth, whilst the remainder is recycled, providing ammonium and other reduced species as nitrogenous substrates for growth during the remainder of the season. The oxidation state of the inorganic nitrogen supply thus identifies new and recycled carbon fixation. Whilst this is convenient “shorthand” for the nitrogen nutrition of carbon export in much of the ocean, it is an inappropriate model for the Southern Ocean. Here, nitrate and ammonium use are simultaneous, and nitrate is never exhausted by the annual phytoplankton production.
We speculate that a range of environmental factors combine to make the large pool of nitrate partially inaccessible to phytoplankton. In addition to the documented effects of low iron availability and high ammonium concentrations, the low temperatures characteristic of the Southern Ocean may decrease nitrate availability because of the increased energetic overheads in its uptake and reduction. This in turn makes ammonium an important nitrogenous substrate, and its production by zooplankton and heterotrophic microorganisms is an important component of the plankton nitrogen cycle. There is some evidence that ammonium production by large grazing animals may stimulate phytoplankton growth. Microbial removal of nitrogen from sedimenting phytoplankton cells may result in local decoupling between the carbon and nitrogen cycles, allowing some reduced nitrogen to remain in the euphotic zone whilst carbon is exported to depth.
Recent studies point to overlap between neuropsychiatric disorders in symptomatology and genetic aetiology.
To systematically investigate genomics overlap between childhood and adult attention-deficit hyperactivity disorder (ADHD), autism spectrum disorder (ASD) and major depressive disorder (MDD).
Analysis of whole-genome blood gene expression and genetic risk scores of 318 individuals. Participants included individuals affected with adult ADHD (n = 93), childhood ADHD (n = 17), MDD (n = 63), ASD (n = 51), childhood dual diagnosis of ADHD–ASD (n = 16) and healthy controls (n = 78).
Weighted gene co-expression analysis results reveal disorder-specific signatures for childhood ADHD and MDD, and also highlight two immune-related gene co-expression modules correlating inversely with MDD and adult ADHD disease status. We find no significant relationship between polygenic risk scores and gene expression signatures.
Our results reveal disorder overlap and specificity at the genetic and gene expression level. They suggest new pathways contributing to distinct pathophysiology in psychiatric disorders and shed light on potential shared genomic risk factors.
We summarise the first year of operation of the Medium Deep Survey - a key project of the HST. Two fields in the LMC are discussed and some preliminary scientific results presented. We also comment on image deconvolution for the extragalactic fields observed as part of the Medium Deep Survey.
With HST and WFPC2, galaxies in the Medium Deep Survey can be reliably classified to magnitudes I814 ≲ 22.0 in the F814W band, at a mean redshift . The main result is the relatively high proportion (~40%) of objects which are in some way irregular or anomalous, and which are of relevance in understanding the origin of the familiar excess population of faint galaxies. These diverse objects include compact galaxies, apparently interacting pairs, galaxies with superluminous starforming regions and diffuse low surface brightness galaxies of various forms. The ‘irregulars’ and ‘peculiar’ galaxies contribute most of the excess counts in the I-band at our limiting magnitude, and may explain the ‘faint blue galaxy’ problem.
We studied the spread of influenza in the community between 1993 and 2009 using primary-care surveillance data to investigate whether the onset of influenza was age-related. Virus detections [A(H3N2), B, A(H1N1)] and clinical incidence of influenza-like illness (ILI) in 12·3 million person-years in the long-running Royal College of General Practitioners-linked clinical-virological surveillance programme in England & Wales were examined. The number of days between symptom onset and the all-age peak ILI incidence was compared by age group for each influenza type/subtype. We found that the increase, peak and decrease of virus detections and ILI incidence occurred in unison. The mean interval from symptom onset to peak ILI incidence in virus detections (all ages) was: A(H3N2), 20·5 [95% confidence interval (CI) 19·7–21·6] days; B, 18·8 (95% CI 15·8–21·7) days; and A(H1N1), 17·0 (95% CI 15·6–18·4) days. Differences by age group were examined using the Kruskal–Wallis test. For A(H3N2) and A(H1N1) viruses the interval was similar in each age group. For influenza B there were highly significant differences by age group (P = 0·0001). Clinical incidence rates of ILI reported in the 8 weeks preceding the period of influenza virus activity were used to estimate a baseline incidence and threshold value (upper 95% CI of the estimate), which was used as a marker of epidemic progress. Differences between the age groups in the week in which the threshold was reached were small and not localized to any age group. In conclusion, we found no evidence to suggest that influenza A(H3N2) or A(H1N1) occurs in the community in one age group before another. For influenza B, virus detection was earlier in children aged 5–14 years than in persons aged ⩾25 years.
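The baseline-plus-threshold construction described in the influenza study — a pre-season baseline incidence with its upper 95% confidence limit used as the epidemic marker — can be sketched as follows. This uses the common mean + 1.96 × SE form of the upper confidence limit; the study's exact estimator may differ:

```python
import math

def epidemic_threshold(baseline_weeks):
    """Estimate a baseline ILI incidence and an epidemic threshold.

    baseline_weeks: weekly ILI incidence rates for the pre-season
    period (8 weeks in the study).  The threshold is taken as the
    upper 95% confidence limit of the baseline mean (mean + 1.96*SE),
    a common construction rather than the paper's exact method.
    """
    n = len(baseline_weeks)
    mean = sum(baseline_weeks) / n
    var = sum((x - mean) ** 2 for x in baseline_weeks) / (n - 1)
    se = math.sqrt(var / n)
    return mean, mean + 1.96 * se

def weeks_to_threshold(weekly_incidence, threshold):
    """Index of the first week at or above the epidemic threshold."""
    for week, rate in enumerate(weekly_incidence):
        if rate >= threshold:
            return week
    return None
```

Computing `weeks_to_threshold` separately for each age group's incidence series gives the per-group crossing weeks whose differences the study found to be small.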
Seasonal respiratory infections place an increased burden on health services annually. We used a sentinel emergency department syndromic surveillance system to understand the factors driving respiratory attendances at emergency departments (EDs) in England. Trends in different respiratory indicators were observed to peak at different points during winter, with further variation observed in the distribution of attendances by age. Multiple linear regression analysis revealed acute respiratory infection and bronchitis/bronchiolitis ED attendances in patients aged 1–4 years were particularly sensitive indicators for increasing respiratory syncytial virus activity. Using near real-time surveillance of respiratory ED attendances may provide early warning of increased winter pressures in EDs, particularly driven by seasonal pathogens. This surveillance may provide additional intelligence about different categories of attendance, highlighting pressures in particular age groups, thereby aiding planning and preparation to respond to acute changes in EDs, and thus the health service in general.
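The regression step described above relates weekly syndromic counts to virological activity. A minimal single-predictor sketch of that fit, implemented by hand with synthetic numbers (the full model in the study would add terms for other pathogens, age group and seasonality):

```python
def fit_attendance_model(rsv_activity, attendances):
    """Least-squares fit of weekly ED respiratory attendances against a
    virological activity series: attendances ~ intercept + slope*activity.

    Inputs are equal-length sequences of weekly values (synthetic here,
    not surveillance figures).  Returns (intercept, slope).
    """
    n = len(rsv_activity)
    mx = sum(rsv_activity) / n
    my = sum(attendances) / n
    sxx = sum((x - mx) ** 2 for x in rsv_activity)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(rsv_activity, attendances))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope
```

A large, precisely estimated slope for one attendance category and age band (such as bronchiolitis in 1–4-year-olds against RSV) is what marks it as a sensitive early-warning indicator.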